@benjiao
Last active August 15, 2019 12:26
import re
import time
import json
import psutil

from slackclient import SlackClient

slack_client = SlackClient("xoxb-103696790404-jv1XDqw2w5dezNWZy0K5ykdG")

# Fetch your Bot's User ID
user_list = slack_client.api_call("users.list")
for user in user_list.get('members'):
    if user.get('name') == "pibot":
        slack_user_id = user.get('id')
        break

# Start connection
if slack_client.rtm_connect():
    print "Connected!"

    while True:
        for message in slack_client.rtm_read():
            if 'text' in message and message['text'].startswith("<@%s>" % slack_user_id):
                print "Message received: %s" % json.dumps(message, indent=2)

                message_text = message['text'].split("<@%s>" % slack_user_id)[1].strip()

                if re.match(r'.*(cpu).*', message_text, re.IGNORECASE):
                    cpu_pct = psutil.cpu_percent(interval=1, percpu=False)
                    slack_client.api_call(
                        "chat.postMessage",
                        channel=message['channel'],
                        text="My CPU is at %s%%." % cpu_pct,
                        as_user=True)

                if re.match(r'.*(memory|ram).*', message_text, re.IGNORECASE):
                    mem = psutil.virtual_memory()
                    mem_pct = mem.percent
                    slack_client.api_call(
                        "chat.postMessage",
                        channel=message['channel'],
                        text="My RAM is at %s%%." % mem_pct,
                        as_user=True)

        time.sleep(1)
@danflurry

This is great. It helped me get a related Python script interacting with my Slackbot. My script waits for a new message to be sent to the bot, then does some processing and GPIO actions on the Pi. I would like to make it so that while a message is being processed, any further incoming messages are stored in a queue and handled once the first one finishes. Any ideas on how to accomplish this?

@itsaandy

@danflurry try using threads so you can keep listening for new requests while you process something.
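
A minimal sketch of that idea, written in the same Python 2 / slackclient style as the gist and assuming the slack_client and slack_user_id set up above; the names message_queue, worker, and handle_message are only illustrative (handle_message would hold the processing/GPIO logic):

import time
import Queue
import threading

message_queue = Queue.Queue()

def worker():
    # Pull queued messages one at a time and process them in order.
    while True:
        message = message_queue.get()
        handle_message(message)  # hypothetical handler with the processing/GPIO work
        message_queue.task_done()

# One background worker; daemon so it exits together with the main script.
worker_thread = threading.Thread(target=worker)
worker_thread.daemon = True
worker_thread.start()

# The RTM loop only enqueues and immediately goes back to listening.
while True:
    for message in slack_client.rtm_read():
        if 'text' in message and message['text'].startswith("<@%s>" % slack_user_id):
            message_queue.put(message)
    time.sleep(1)

This way the main loop never blocks on a long-running message, and anything that arrives while the worker is busy simply waits in the queue until it is picked up.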
