@browny
Created February 12, 2020 20:14
# Labs of PubSub101

## Lab 1. Publish Streaming Data into Pub/Sub

### 1. Preparation

Create the training VM, verify the startup script finished, and copy the lab files:

```sh
gcloud compute instances create training-vm \
  --zone=asia-east1-b --machine-type=n1-standard-1 \
  --metadata=startup-script-url=gs://cloud-training/initscripts/init-script-sdp-1.sh \
  --scopes=https://www.googleapis.com/auth/cloud-platform \
  --image-family=debian-9 --image-project=debian-cloud

# Confirm the startup script populated /training and pip is installed
ls /training
pip --version

# Copy the course repository into the home directory
cp -r /training/training-data-analyst/ .

# Export the project ID for later commands
export DEVSHELL_PROJECT_ID=$(gcloud config get-value project)
```
### 2. Create a Pub/Sub topic and subscription

```sh
cd ~/training-data-analyst/courses/streaming/publish

# Create the topic and publish a first message (no subscription exists yet)
gcloud pubsub topics create sandiego
gcloud pubsub topics publish sandiego --message "hello"

# Create a subscription and pull; the earlier "hello" is not delivered
# because it was published before the subscription existed
gcloud pubsub subscriptions create --topic sandiego mySub1
gcloud pubsub subscriptions pull --auto-ack mySub1

# Publish again and pull; this time "hello again" is received
gcloud pubsub topics publish sandiego --message "hello again"
gcloud pubsub subscriptions pull --auto-ack mySub1

# Clean up
gcloud pubsub subscriptions delete mySub1
```
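The behavior demonstrated above — a subscription only receives messages published after it was created — can be sketched with a minimal in-memory model. This is an illustrative toy, not the real Pub/Sub client API:

```python
class Topic:
    """Toy Pub/Sub topic: fans out each message to the subscriptions
    that exist at publish time. Real Pub/Sub behaves the same way for
    newly created subscriptions."""

    def __init__(self):
        self.subscriptions = {}  # name -> list of pending messages

    def create_subscription(self, name):
        self.subscriptions[name] = []

    def publish(self, message):
        # Only subscriptions that already exist receive the message
        for queue in self.subscriptions.values():
            queue.append(message)

    def pull(self, name):
        # Auto-ack behavior: drain and return all pending messages
        messages, self.subscriptions[name] = self.subscriptions[name], []
        return messages


sandiego = Topic()
sandiego.publish("hello")               # no subscription yet, so it is dropped
sandiego.create_subscription("mySub1")
print(sandiego.pull("mySub1"))          # []
sandiego.publish("hello again")
print(sandiego.pull("mySub1"))          # ['hello again']
```

This mirrors the lab's output: the first pull on `mySub1` returns nothing, and only "hello again" comes through on the second pull.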
### 3. Simulate traffic sensor data into Pub/Sub

```sh
cd ~/training-data-analyst/courses/streaming/publish

# Inspect the simulator script
nano send_sensor_data.py

# Download the sensor dataset
./download_data.sh

# Install the Pub/Sub client library the script depends on
sudo apt-get install -y python-pip
sudo pip install -U google-cloud-pubsub

# Replay the sensor data at 60x real time
./send_sensor_data.py --speedFactor=60 --project $DEVSHELL_PROJECT_ID
```
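`--speedFactor=60` makes the script replay historical, timestamped records 60 times faster than real time: records one minute apart in the dataset are published about one second apart. A minimal sketch of that pacing logic, with hypothetical timestamps (this is not the script's actual code):

```python
from datetime import datetime, timedelta


def compute_sleep(first_ts, current_ts, start_wall, now_wall, speed_factor):
    """Seconds to sleep before publishing a record so that simulated
    time advances speed_factor times faster than wall-clock time."""
    sim_elapsed = (current_ts - first_ts).total_seconds() / speed_factor
    real_elapsed = (now_wall - start_wall).total_seconds()
    return max(0.0, sim_elapsed - real_elapsed)


# Two records one minute apart, replayed at 60x -> ~1 second between sends
t0 = datetime(2020, 2, 12, 8, 0, 0)
t1 = t0 + timedelta(minutes=1)
start = datetime(2020, 2, 12, 20, 0, 0)
print(compute_sleep(t0, t1, start, start, 60))  # 1.0
```

If the publisher has fallen behind schedule (wall-clock time already ahead of the target), the sleep clamps to zero and the record is sent immediately.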
### 4. Verify that messages are received

```sh
# In a second SSH session on training-vm
cd ~/training-data-analyst/courses/streaming/publish

# Create a second subscription and pull the simulated sensor messages
gcloud pubsub subscriptions create --topic sandiego mySub2
gcloud pubsub subscriptions pull --auto-ack mySub2
# To pull up to 100 messages at once:
# gcloud pubsub subscriptions pull --auto-ack mySub2 --limit 100

# Clean up and leave the session
gcloud pubsub subscriptions delete mySub2
exit
```