Robin Moffatt rmoff

rmoff / jmeter_ibmmq.jmx
Last active Oct 16, 2020
IBM MQ JMeter test script
<?xml version="1.0" encoding="UTF-8"?>
<jmeterTestPlan version="1.2" properties="4.0" jmeter="4.0 r1823414">
  <hashTree>
    <TestPlan guiclass="TestPlanGui" testclass="TestPlan" testname="Test Plan" enabled="true">
      <stringProp name="TestPlan.comments"></stringProp>
      <boolProp name="TestPlan.functional_mode">false</boolProp>
      <boolProp name="TestPlan.tearDown_on_shutdown">true</boolProp>
      <boolProp name="TestPlan.serialize_threadgroups">false</boolProp>
      <elementProp name="TestPlan.user_defined_variables" elementType="Arguments" guiclass="ArgumentsPanel" testclass="Arguments" testname="User Defined Variables" enabled="true">
        <collectionProp name="Arguments.arguments">
rmoff / docker-compose.yml
Last active Oct 12, 2020
Multi-node Kafka cluster (three brokers)
---
version: '3.8'
services:
  zookeeper-1:
    image: confluentinc/cp-zookeeper:5.5.1
    ports:
      - '32181:32181'
    environment:
      ZOOKEEPER_CLIENT_PORT: 32181
      ZOOKEEPER_TICK_TIME: 2000
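The description mentions three brokers, but the preview shows only the first ZooKeeper node. A hedged sketch of what one broker service might look like in the same compose file (service name, ports, and listener values are assumptions for illustration, not the gist's actual content):

```yaml
  kafka-1:
    image: confluentinc/cp-kafka:5.5.1
    depends_on:
      - zookeeper-1
    ports:
      - '9092:9092'
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper-1:32181
      # Advertise a listener reachable from the host; adjust for your network.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # With three brokers, the internal topics can use replication factor 3.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 3
```

The other two brokers would follow the same pattern with distinct broker IDs and host ports.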
rmoff / gist:179ed4067b9f042344cf597286ac1840
Created Sep 14, 2017
Example of Using Kafka Single Message Transform TimestampConverter
"_comment": "Use SMT to cast op_ts and current_ts to timestamp datatype (TimestampConverter is Kafka >=0.11 / Confluent Platform >=3.3). Format from https://docs.oracle.com/javase/8/docs/api/java/text/SimpleDateFormat.html",
"transforms": "convert_op_ts,convert_current_ts",
"transforms.convert_op_ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.convert_op_ts.target.type": "Timestamp",
"transforms.convert_op_ts.field": "op_ts",
"transforms.convert_op_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS",
"transforms.convert_current_ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.convert_current_ts.target.type": "Timestamp",
"transforms.convert_current_ts.field": "current_ts",
"transforms.convert_current_ts.format": "yyyy-MM-dd HH:mm:ss.SSSSSS"
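The format string uses Java SimpleDateFormat patterns. As an illustration only (Python's strptime is an analogy here, not what the connector runs), the same microsecond-precision pattern corresponds roughly to:

```python
from datetime import datetime

# Java's "yyyy-MM-dd HH:mm:ss.SSSSSS" maps roughly to
# Python's "%Y-%m-%d %H:%M:%S.%f" for a microsecond-precision string.
ts = datetime.strptime("2017-09-14 12:34:56.123456", "%Y-%m-%d %H:%M:%S.%f")
print(ts.year, ts.microsecond)
```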
rmoff / List all available Kafka Connect plugins.md
Created May 18, 2018
List all available Kafka Connect plugins
$ curl -s -XGET http://localhost:8083/connector-plugins|jq '.[].class'

"io.confluent.connect.activemq.ActiveMQSourceConnector"
"io.confluent.connect.elasticsearch.ElasticsearchSinkConnector"
"io.confluent.connect.hdfs.HdfsSinkConnector"
"io.confluent.connect.hdfs.tools.SchemaSourceConnector"
"io.confluent.connect.ibm.mq.IbmMQSourceConnector"
"io.confluent.connect.jdbc.JdbcSinkConnector"
"io.confluent.connect.jdbc.JdbcSourceConnector"
"io.confluent.connect.jms.JmsSourceConnector"
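The REST endpoint returns a JSON array of objects with class, type, and version fields; the jq filter above pulls out just class. A small Python sketch of the same extraction against a sample payload (the payload is made up for illustration, not taken from a live worker):

```python
import json

# Illustrative /connector-plugins response
payload = '''[
  {"class": "io.confluent.connect.jdbc.JdbcSinkConnector", "type": "sink", "version": "5.5.1"},
  {"class": "io.confluent.connect.jdbc.JdbcSourceConnector", "type": "source", "version": "5.5.1"}
]'''

# Equivalent of: jq '.[].class'
classes = [plugin["class"] for plugin in json.loads(payload)]
print(classes)
```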
rmoff / 01_Spark+Streaming+Kafka+Twitter.ipynb
Last active Sep 17, 2020
Simple example of processing twitter JSON payload from a Kafka stream with Spark Streaming in Python
rmoff / 00.README.md
Last active Sep 11, 2020
RMarkdown - repeating child block with changing variable value

When you use knit_expand it appears that the inclusion of the Rmd is done on the first pass, and the complete document is then evaluated. This means that an Rmd block referenced in a loop with knit_expand will only see changing variables at their final value.

This can be worked around by passing the literal value of the variable at the time of the knit_expand call, using the {{var}} syntax.

This is documented in the knit_expand docs, but is less clear (to an R noob like me) for embedded documents than for strings.
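A minimal sketch of the workaround (file names and variable names here are illustrative): the parent document loops over values and passes each one to knit_expand, which substitutes it literally wherever the child uses {{var}}, so the child sees the value at expansion time rather than the loop variable's final value.

```r
# In the parent .Rmd, inside a chunk with results='asis':
src <- lapply(c("a", "b", "c"), function(x) {
  # Every {{x}} in child.Rmd is replaced with the literal value of x
  knitr::knit_expand(file = "child.Rmd", x = x)
})
cat(knitr::knit(text = unlist(src)), sep = "\n")
```

child.Rmd then refers to {{x}} rather than to the loop variable itself.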

rmoff / gist:379e6ce46eb128110f38
Last active Aug 14, 2020
Kibana 3 and Elasticsearch 1.4 - CORS problem

Kibana 3 against Elasticsearch 1.4 throws a Connection Failed screen. The error text says to set http.cors.allow-origin, but it misses out the important http.cors.enabled: true

Working config:

$ grep cors elasticsearch-1.4.0.Beta1/config/elasticsearch.yml
http.cors.allow-origin: "/.*/"
http.cors.enabled: true
rmoff / gist:f32543f78d821b25502f6db49eee9259
Created Aug 29, 2017
Kafka Connect JDBC source with JSON converter
{
  "name": "jdbc_source_mysql_foobar_01",
  "config": {
    "_comment": "The JDBC connector class. Don't change this if you want to use the JDBC Source.",
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "_comment": "How to serialise the keys - here use the JSON converter. We want to retain the schema in the message (which will generate a schema/payload JSON document), so set schemas.enable=true",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "true",
rmoff / gist:9687727
Last active Jul 21, 2020
XRDP - connect using any color (colour) depth

XRDP version

xrdp 0.5.0

Problem

When connecting from mstsc or other RDP clients, unless you specify 24-bit colour depth (Millions), sesman fails. The connection log just shows "error - problem connecting".
