@npearce
Last active September 5, 2018 03:28
BigStats Setup for: F5 BIG-IP -> Apache Kafka message bus

Create the Apache Kafka Container for BigStats on AWS

Updated: August 21, 2018

NOTE: This setup is used as an exporter destination for BigStats: https://npearce.github.io

Create the Instance

On the AWS Console:

  1. In 'Instances', click 'Launch Instance'.
  2. Select 'Amazon Linux 2 AMI (HVM), SSD Volume Type'
  3. Select 't2.medium' (perfectly fine for lab testing), and click 'Next: Configure Instance Details'
  4. Select the appropriate 'Network' and 'Subnet' for your environment, one that can reach your BIG-IP management interface. Click 'Review and Launch'.
  5. Apply the correct Security Group to allow access to:
    • Kafka 9092/tcp
    • SSH to the docker host: 22/tcp
  6. Click 'Launch'
  7. Select the appropriate key pair you have access to, click 'Launch Instances'.
  8. Click 'View Instances' to be taken to the newly created instance and watch it boot!
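If you prefer the AWS CLI, the console steps above can be sketched as a single run-instances call. Every identifier below is a placeholder to substitute for your own environment; the security group should already allow 9092/tcp (Kafka) and 22/tcp (SSH):

```shell
# Sketch only: launch the t2.medium from the CLI instead of the console.
# All IDs are placeholders - substitute values from your own AWS account.
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type t2.medium \
  --subnet-id subnet-xxxxxxxx \
  --security-group-ids sg-xxxxxxxx \
  --key-name YourAWSKey
```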

Install Docker

Once the 'Status Checks' transition from Initializing to '2/2 checks passed':

  1. [OPTIONAL] Give the new instance a name.
  2. Select the instance and click 'Connect' to access the connection details.
  3. SSH into the new instance, e.g. ssh -i "YourAWSKey.pem" ec2-user@ec2-11-22-33-44.us-west-1.compute.amazonaws.com
  4. Update the package cache: sudo yum update -y
  5. Install the most recent Docker Community Edition package: sudo yum install -y docker
  6. Start the Docker service: sudo service docker start
  7. Add the ec2-user to the docker group: sudo usermod -a -G docker ec2-user
  8. Log out and log back in again to pick up the new docker group permissions.
  9. Verify that the ec2-user can run Docker commands without sudo: docker info

Note: In some cases, you may need to reboot your instance to provide permissions for the ec2-user to access the Docker daemon. Try rebooting your instance if you see the following error: Cannot connect to the Docker daemon. Is the docker daemon running on this host?

Install docker-compose

Install docker-compose to build the Kafka cluster from a single definition file.

  1. Download docker-compose: sudo curl -L https://github.com/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m) -o /usr/local/bin/docker-compose
  2. Fix permissions after download: sudo chmod +x /usr/local/bin/docker-compose
  3. Verify by executing: docker-compose version

Pull the repo

  1. ssh onto the Docker Host and install git: sudo yum install -y git
  2. Clone the wurstmeister/kafka-docker repository: git clone https://github.com/wurstmeister/kafka-docker
  3. Change to the repo directory: cd kafka-docker/
  4. [For a 'single kafka broker' environment] Edit the file docker-compose-single-broker.yml and change the KAFKA_ADVERTISED_HOST_NAME value.

NOTE: Use the hostname or IP of the current docker host, not localhost.
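After the edit, the kafka service in docker-compose-single-broker.yml looks something like the excerpt below. This is a sketch based on the wurstmeister/kafka-docker repository layout, and 10.0.0.10 is a placeholder for your docker host's address:

```yaml
# docker-compose-single-broker.yml (excerpt, sketch)
services:
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 10.0.0.10    # your docker host IP, not localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```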

  5. Start kafka: docker-compose -f docker-compose-single-broker.yml up -d

You should see something like:

Starting kafka-docker_kafka_1     ... done
Starting kafka-docker_zookeeper_1 ... done

docker-compose has created two new containers for your single broker kafka environment.

  6. On the docker host, execute docker ps and you should see a status of 'Up' for both containers.

NOTE: A t2.micro instance may not have enough resources, and you may have trouble keeping the containers running.

Interact with Apache Kafka

  1. Open a shell prompt to the kafka container: docker exec -it kafka-docker_kafka_1 sh
  2. To list the kafka topics known to this broker, execute: kafka-topics.sh --list --zookeeper zookeeper

You should see two default topics:

topic1
topic2

  3. View the messages inside a topic by executing: kafka-simple-consumer-shell.sh --broker-list localhost:9092 --topic topic1
  4. Delete a topic: kafka-topics.sh --zookeeper zookeeper --delete --topic topic2
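If you want to see a message flow end to end before wiring up BigStats, you can create a topic, produce a test message, and read it back from the same container shell. This is a sketch; the console producer and consumer scripts ship in the broker image alongside kafka-topics.sh:

```shell
# Run inside the kafka container shell opened in step 1 (sketch).
# Create a test topic on this single broker.
kafka-topics.sh --zookeeper zookeeper --create \
  --replication-factor 1 --partitions 1 --topic test
# Pipe one line into the console producer to publish it.
echo "hello from the docker host" | \
  kafka-console-producer.sh --broker-list localhost:9092 --topic test
# Read it back from the beginning of the topic.
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic test --from-beginning --max-messages 1
```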

NOTE: There's not much to see right now. However, you are ready to configure BigStats to send data to the Kafka Broker.
