@amit08255
Last active June 17, 2023 07:07
GCP Cheatsheet

Connecting to a GCP compute instance using Cloud Shell

  1. Switch to the project (the project ID is shown at the top left, near the search bar):
gcloud config set project <project-id>
  2. Connect to the instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"

Example:

gcloud beta compute ssh --zone "us-central1-a" "instance-1"  --project "clgcporg8-052"
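To confirm which project is currently active before connecting, you can query the gcloud configuration:

```shell
# Print the currently active project ID
gcloud config get-value project

# List all configured properties, including account and project
gcloud config list
```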

Copy files from a GCP instance to local

  1. Activate Cloud Shell and use the command below to copy a file to your local machine:
gcloud compute scp --zone "instance_zone_name" vm_instance_name:path_from_copy local_path_to_copy

To copy a folder, use:

gcloud compute scp --recurse --zone "instance_zone_name" vm_instance_name:path_from_copy local_path_to_copy

Example:

gcloud compute scp --zone "us-central1-a" instance-1:~/node/index.js ~
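gcloud compute scp also works in the other direction, copying from your machine up to the instance. A sketch with placeholder paths:

```shell
# Copy a local file to the ~/node folder on the instance
gcloud compute scp --zone "us-central1-a" ~/index.js instance-1:~/node/
```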

Installing Nginx on a Linux VM

sudo apt-get update
sudo apt install nginx -y

Installing NodeJS on a Linux VM

cd ~
curl -sL https://deb.nodesource.com/setup_16.x -o /tmp/nodesource_setup.sh
sudo bash /tmp/nodesource_setup.sh
sudo apt install nodejs
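To verify the installation, check the installed versions (the exact numbers depend on the NodeSource setup script used above):

```shell
node -v   # prints the installed Node.js version, e.g. v16.x
npm -v    # prints the bundled npm version
```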

Deploying NextJS on a GCP VM (use PM2 to keep the server running in the background)

  1. Set up a NextJS project:
npx create-next-app@latest
  2. Install Nginx:
sudo apt-get update
sudo apt install nginx -y
  3. Set up Nginx to proxy NextJS from port 3000 to port 80:
cd /etc/nginx/sites-available
sudo mv default def.bak
sudo vim default

Paste the configuration below into the file:

server {
      listen       80;
      listen       [::]:80;
      location / {
              # reverse proxy for next server
              proxy_pass http://localhost:3000;
              proxy_http_version 1.1;
              proxy_set_header Upgrade $http_upgrade;
              proxy_set_header Connection 'upgrade';
              proxy_set_header Host $host;
              proxy_cache_bypass $http_upgrade;
      }
}
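Before restarting Nginx, it is worth validating the edited configuration; nginx -t checks the syntax without applying anything:

```shell
# Test the configuration files for syntax errors
sudo nginx -t
```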

Restart Nginx:

sudo systemctl restart nginx
  4. Install pm2:
sudo npm install pm2@latest -g
  5. Build the NextJS application:
npm run build
  6. Start the app with PM2 using the npm start script:
pm2 start npm --name "nextapp" -- start
  7. Check the status of the running app process:
pm2 status
  8. Stop the process:
pm2 stop nextapp
  9. To update the app, rebuild it and restart the process:
pm2 restart nextapp
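Processes started this way do not survive a VM reboot by default. PM2 can register itself with the init system and resurrect the saved process list on boot:

```shell
# Generate and print a startup command for the system's init system;
# run the command PM2 prints to install the service
pm2 startup

# Save the current process list so it is restored on boot
pm2 save
```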

Setup Jenkins on a GCP VM

  1. Install Jenkins:
sudo apt update
sudo apt install openjdk-17-jdk
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | sudo tee \
  /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] \
  https://pkg.jenkins.io/debian-stable binary/ | sudo tee \
  /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update
sudo apt-get install jenkins -y
  2. Check Jenkins status:
sudo systemctl status jenkins

Start Jenkins if it is not running:

sudo systemctl start jenkins
  3. Allow Jenkins port 8080 through the firewall (if required). If you are connected over SSH, allow OpenSSH first so that enabling ufw does not lock you out:
sudo ufw allow OpenSSH
sudo ufw enable
sudo ufw allow 8080
sudo ufw status
  4. Access Jenkins on port 8080 of the server. Run the command below to get the initial password needed to set up Jenkins:
sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Setup Kafka on GCP

Switch to the project (the project ID is shown at the top left, near the search bar):

gcloud config set project <project-id>

Connect to instance:

gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name"  --project "project-name"
  1. Install Java:
sudo apt update
sudo apt install openjdk-17-jdk
  2. Download Kafka from the Apache downloads page and extract it:
wget https://downloads.apache.org/kafka/3.5.0/kafka_2.12-3.5.0.tgz
tar -xzf kafka_2.12-3.5.0.tgz
cd kafka_2.12-3.5.0/
  3. Start the ZooKeeper service:
bin/zookeeper-server-start.sh config/zookeeper.properties
  4. Open another terminal in the same project and connect to the instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
  5. Start the Kafka broker service:
bin/kafka-server-start.sh config/server.properties
  6. Open another terminal in the same project and connect to the instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
  7. Start a Kafka producer to write some events to a Kafka topic:
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092

Type any message in this terminal to send it to the topic.

  8. Open another terminal in the same project and connect to the instance:
gcloud beta compute ssh --zone "instance_zone_name" "vm-instance-name" --project "project-name"
  9. Start a Kafka consumer to read messages from the topic:
bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
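With the default broker settings the console producer can auto-create the topic, but topics can also be created and inspected explicitly with the bundled kafka-topics.sh tool (run from the Kafka folder):

```shell
# Create the topic explicitly
bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092

# List all topics on the broker
bin/kafka-topics.sh --list --bootstrap-server localhost:9092

# Show partition and replication details for the topic
bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092
```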

Allow remote Kafka connections

First, make sure the Kafka port 9092 is not blocked by the firewall. Then go to the config folder inside the Kafka folder and edit server.properties:

nano server.properties

Look for the lines below (they may be commented out; remove the leading # if so):

advertised.listeners=PLAINTEXT://localhost:9092
listeners=PLAINTEXT://0.0.0.0:9092

Update the lines to use the server's external IP, then restart the Kafka broker for the change to take effect:

advertised.listeners=PLAINTEXT://SERVER_IP:9092
listeners=PLAINTEXT://0.0.0.0:9092

Example:

advertised.listeners=PLAINTEXT://35.208.219.6:9092
listeners=PLAINTEXT://0.0.0.0:9092
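On GCP, the VPC firewall must also allow the port, not just ufw on the VM. A sketch using gcloud (the rule name allow-kafka is arbitrary, and the source range is a placeholder; restrict it to trusted IPs rather than opening the port to the world):

```shell
# Allow inbound TCP 9092 from a trusted address range (placeholder range)
gcloud compute firewall-rules create allow-kafka \
  --allow=tcp:9092 \
  --source-ranges="203.0.113.0/24"
```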

Example NodeJS code to connect to Kafka:

// install kafkajs package dependency
// producer.js

// import the `Kafka` instance from the kafkajs library
const { Kafka } = require("kafkajs")

// the client ID lets kafka know who's producing the messages
const clientId = "my-app"
// we can define the list of brokers in the cluster
const brokers = ["localhost:9092"] // or SERVER_IP:PORT
// this is the topic to which we want to write messages
const topic = "message-log"

// initialize a new kafka client and initialize a producer from it
const kafka = new Kafka({ clientId, brokers })
const producer = kafka.producer()

// we define an async function that writes a new message each second
const produce = async () => {
	await producer.connect()
	let i = 0

	// after the producer has connected, we start an interval timer
	setInterval(async () => {
		try {
			// send a message to the configured topic with
			// the key and value formed from the current value of `i`
			await producer.send({
				topic,
				messages: [
					{
						key: String(i),
						value: "this is message " + i,
					},
				],
			})

			// if the message is written successfully, log it and increment `i`
			console.log("writes: ", i)
			i++
		} catch (err) {
			console.error("could not write message " + err)
		}
	}, 1000)
}

const consumer = kafka.consumer({ groupId: clientId })

const consume = async () => {
	// first, we wait for the client to connect and subscribe to the given topic
	await consumer.connect()
	await consumer.subscribe({ topic })
	await consumer.run({
		// this function is called every time the consumer gets a new message
		eachMessage: ({ message }) => {
			// here, we just log the message to the standard output
			console.log(`received message: ${message.value}`)
		},
	})
}


module.exports = { produce, consume }
// index.js
const kafka = require("./producer")

const { produce, consume } = kafka;


// call the `produce` function and log an error if it occurs
produce().catch((err) => {
	console.error("error in producer: ", err)
})

// start the consumer, and log any errors
consume().catch((err) => {
	console.error("error in consumer: ", err)
})
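Assuming the two files above are saved as producer.js and index.js in the same folder, the example can be run like this (kafkajs must be installed first, and a broker must be reachable at the configured address):

```shell
# Install the kafkajs client library
npm install kafkajs

# Run the producer/consumer example
node index.js
```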