destalinator from docker throws an encoding='utf-8' error when using Python 3.6 #161

Open
ahalasz opened this issue Dec 8, 2017 · 6 comments · May be fixed by #211
@ahalasz

ahalasz commented Dec 8, 2017

I installed Destalinator as a Docker image, set the right slack_name in configuration.yaml, and added the mandatory environment variables to a separate .env file (myvars.env) that I pass in when I run the container.
Then I noticed that nothing was happening in the Slack channels and the coverage was 70%.

I then went into the running Docker container and ran ./warner.py, which gives the following output:

/destalinator # ./warner.py
2017-12-08 12:21:15,880 [DEBUG]: Logging to slack channel: destalinator-log
2017-12-08 12:21:15,881 [DEBUG]: activated is TRUE
2017-12-08 12:21:15,886 [INFO]: Starting new HTTPS connection (1): slacktest.slack.com #name changed
2017-12-08 12:21:16,004 [DEBUG]: "GET /api/users.list?token= HTTP/1.1" 200 1299
2017-12-08 12:21:16,005 [DEBUG]: All restricted user names:
2017-12-08 12:21:16,079 [DEBUG]: "GET /api/channels.list?exclude_archived=1&token= HTTP/1.1" 200 635
2017-12-08 12:21:16,080 [DEBUG]: activated is TRUE
2017-12-08 12:21:16,082 [INFO]: Warning
2017-12-08 12:21:16,082 [INFO]: *ACTION: Warning all channels stale for more than 30 days*
2017-12-08 12:21:16,082 [DEBUG]: Not warning #destalinator-log because it's in ignore_channels
2017-12-08 12:21:16,160 [DEBUG]: "GET /api/channels.info?token=&channel=C1EA01JR1 HTTP/1.1" 200 405
2017-12-08 12:21:16,209 [DEBUG]: "GET /api/channels.info?token=&channel=C1EA01JR1 HTTP/1.1" 200 405
2017-12-08 12:21:16,210 [DEBUG]: Current members in general are {'U1EA01HH9', 'U5FFN9XPA'}
2017-12-08 12:21:16,285 [DEBUG]: "GET /api/channels.history?oldest=1510143676&token=&channel=C1EA01JR1&latest=1512735676 HTTP/1.1" 200 146
2017-12-08 12:21:16,286 [DEBUG]: Fetched 1 messages for #general over 30 days
2017-12-08 12:21:16,286 [DEBUG]: Filtered down to 1 messages based on included_subtypes: bot_message, channel_name, channel_purpose, channel_topic, file_mention, file_share, me_message, message_replied, reply_broadcast, slackbot_response
2017-12-08 12:21:16,286 [DEBUG]: Purging cache for general
2017-12-08 12:21:16,358 [DEBUG]: "GET /api/channels.info?token=&channel=C1EA01K0B HTTP/1.1" 200 463
2017-12-08 12:21:16,430 [DEBUG]: "GET /api/channels.info?token=&channel=C1EA01K0B HTTP/1.1" 200 463
2017-12-08 12:21:16,431 [DEBUG]: Current members in random are {'U1EA01HH9', 'U5FFN9XPA'}
2017-12-08 12:21:16,477 [DEBUG]: "GET /api/channels.history?oldest=1510143676&token=&channel=C1EA01K0B&latest=1512735676 HTTP/1.1" 200 94
2017-12-08 12:21:16,478 [DEBUG]: Fetched 0 messages for #random over 30 days
2017-12-08 12:21:16,478 [DEBUG]: Filtered down to 0 messages based on included_subtypes: bot_message, channel_name, channel_purpose, channel_topic, file_mention, file_share, me_message, message_replied, reply_broadcast, slackbot_response
2017-12-08 12:21:16,553 [DEBUG]: "GET /api/channels.info?token=&channel=C1EA01K0B HTTP/1.1" 200 463
2017-12-08 12:21:16,553 [DEBUG]: Current members in random are {'U1EA01HH9', 'U5FFN9XPA'}
2017-12-08 12:21:16,553 [DEBUG]: Returning 0 cached messages for #random over 30 days
Traceback (most recent call last):
  File "./warner.py", line 18, in <module>
    Warner().warn(force_warn=force_warn)
  File "./warner.py", line 11, in warn
    self.ds.warn_all(self.config.warn_threshold, force_warn)
  File "/destalinator/destalinator.py", line 235, in warn_all
    if self.warn(channel, days, force_warn):
  File "/destalinator/destalinator.py", line 217, in warn
    self.post_marked_up_message(channel_name, self.warning_text, message_type='channel_warning')
  File "/destalinator/destalinator.py", line 105, in post_marked_up_message
    self.slacker.post_message(channel_name, self.add_slack_channel_markup(message), **kwargs)
  File "/destalinator/slacker.py", line 259, in post_message
    post_data['attachments'] = json.dumps([{'fallback': message_type}], encoding='utf-8')
  File "/usr/local/lib/python3.6/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
TypeError: __init__() got an unexpected keyword argument 'encoding'

I removed the api_token from the above output and changed the Slack workspace name.
As you can see, it seems to connect to the desired Slack workspace, but it throws an error at the end (which might be related to the fact that there are 0 cached messages in the last 30 days?).
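For reference, the failure at the bottom of that traceback is reproducible outside Destalinator: on Python 3, `json.dumps()` no longer accepts the `encoding` keyword that Python 2 allowed, and passing it raises exactly this `TypeError`. A minimal repro:

```python
import json

# Python 3's json.dumps() forwards unknown kwargs to JSONEncoder.__init__,
# which has no 'encoding' parameter, so this raises a TypeError.
try:
    json.dumps([{'fallback': 'channel_warning'}], encoding='utf-8')
except TypeError as exc:
    print(exc)  # __init__() got an unexpected keyword argument 'encoding'

# Without the kwarg the same call works fine and returns a str.
print(json.dumps([{'fallback': 'channel_warning'}]))
# → [{"fallback": "channel_warning"}]
```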

I run the Docker container with the command:
`docker run --env-file myvars.env -it -d -p 8080:80 --name=destalinator <image name> sh -c "coverage html --skip-covered && python -m http.server 80"`

@kmarekspartz
Contributor

That command is for viewing the unit test coverage. You should be able to run `docker run --env-file myvars.env -it -d --name=destalinator <image name>` on its own. Adding DESTALINATOR_RUN_ONCE to your env file will help with debugging by skipping the scheduler.

I don't think this is related to that traceback, however. From the Python docs (https://docs.python.org/3/library/json.html), it looks like json.loads can take an encoding kwarg, but json.dumps cannot.

We're setting encoding for json.dumps here: https://github.com/randsleadershipslack/destalinator/blob/master/slacker.py#L259

That's gated by a conditional on message_type, but it looks like we set message_type on every call into it. The relevant trace lines are:

```
    self.post_marked_up_message(channel_name, self.warning_text, message_type='channel_warning')
  File "/destalinator/destalinator.py", line 105, in post_marked_up_message
    self.slacker.post_message(channel_name, self.add_slack_channel_markup(message), **kwargs)
  File "/destalinator/slacker.py", line 259, in post_message
    post_data['attachments'] = json.dumps([{'fallback': message_type}], encoding='utf-8')
```

Unsure why this hasn't been a problem before, but I suspect dropping the `, encoding='utf-8'` from that line will fix this.
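A minimal sketch of that fix, with the line pulled out into a hypothetical standalone helper (the real code lives inline in `Slacker.post_message`):

```python
import json

def build_attachments(message_type):
    # Hypothetical helper mirroring slacker.py line 259 with the
    # 'encoding' kwarg dropped. json.dumps returns a text string on
    # both Python 2 and Python 3 for a payload like this, so the
    # keyword was never needed here.
    return json.dumps([{'fallback': message_type}])

print(build_attachments('channel_warning'))
# → [{"fallback": "channel_warning"}]
```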

@kmarekspartz
Contributor

Ah! Python 2.7 does accept an encoding kwarg here: https://docs.python.org/2/library/json.html

@TheConnMan have you run into this with the Docker deployment?

@ahalasz ahalasz changed the title destalinator throws an error when configured to a slack that has 0 messages in the last 30days destalinator from docker throws an encoding='utf-8' error Dec 8, 2017
@TheConnMan
Contributor

I just did, yes. I hadn't gotten around to actually running the Dockerized version until today. Dropping the base Docker image down to python:2.7 works like a charm. I'll submit a PR in a few hours.

TheConnMan added a commit to TheConnMan/destalinator that referenced this issue Dec 8, 2017
@kmarekspartz kmarekspartz reopened this Dec 8, 2017
@kmarekspartz kmarekspartz changed the title destalinator from docker throws an encoding='utf-8' error destalinator from docker throws an encoding='utf-8' error when using Python 3.6 Dec 8, 2017
@kmarekspartz
Contributor

@ahalasz This should be fixed, as the docker image is using python2.7 now. I'm keeping this issue open until we fix 3.6. Thanks for reporting this!

@ahalasz
Author

ahalasz commented Dec 11, 2017

Thanks, it's working now. There is a typo in your Dockerfile: the Python image should be python:2.7, not python:alpine2.7.

@kmarekspartz
Contributor

kmarekspartz commented Dec 11, 2017

Yes, or python:2.7-alpine, though that one seems to complain about cffi. Thanks!
