
Setup

First, I created a simplified pipeline using a generator input to produce exactly one event whose message is the value of an environment variable, and a cipher filter that similarly uses a key from an environment variable:

input {
  generator {
    count => 1
    message => "${CIPHERTEXT}"
  }
}

filter {
  cipher {
    algorithm => "aes-256-cbc"
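    # the incoming payload is base64(iv + ciphertext); with iv_random_length set,
    # the filter reads the 16-byte IV from the front of the decoded payload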
    iv_random_length => 16
    key => "${KEY}"
    key_size => 32
    mode => "decrypt"
    source => "message"
    target => "decrypted_message"
  }
}

output {
  stdout {
    codec => json_lines
  }
}
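
As an aside, the pipeline file can be syntax-checked before feeding it real data using Logstash's --config.test_and_exit flag. This is a sketch and not part of the original session; the placeholder values exist only to satisfy the ${CIPHERTEXT} and ${KEY} references, and the binary path matches the invocation used later in this gist:

CIPHERTEXT="unused" KEY="00000000000000000000000000000000" \
  logstash-8.14.1/bin/logstash -f "${PWD}/pipeline.conf" --config.test_and_exit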

Next, I created a file with a message that included a timestamp:

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ echo "this is a test, let's see if it works $(date)" > testfile_v12.txt
[success]

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ cat testfile_v12.txt
this is a test, let's see if it works Wed Jun 12 23:37:13 UTC 2024
[success]

Then, I exported a KEY environment variable containing a 32-byte key. Although it would be much more secure to use random non-ASCII bytes, the cipher filter has no way to accept an encoded (e.g. hex or base64) key, so we are limited to a key that can be written as a literal string.

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ export KEY="test1234test1234test1234test1234"
[success]
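
As a quick sanity check (not part of the original session), we can confirm that the exported key is exactly 32 bytes, matching the key_size => 32 setting in the pipeline:

printf '%s' "${KEY}" | wc -c    # expect: 32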

Producing an encrypted payload

Then I did the complicated bit of producing the encrypted payload:

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ (iv="$(head -c 16 /dev/urandom)"; iv_hex="$(print -n "${iv}" | xxd -p)"; >&2 echo "IV_HEX=${iv_hex}"; key_hex="$(print -n "${KEY}" | xxd -p | tr -d '\n')"; >&2 echo "KEY_HEX=${key_hex}";ciphertext="$(openssl enc -aes-256-cbc -salt -pbkdf2 -iv "${iv_hex}" -in testfile_v12.txt -K "${key_hex}")"; print -n "${iv}${ciphertext}" | base64) > testfile_v12.enc
IV_HEX=b6ed9c1b887f55a537f5ff1159246141
KEY_HEX=7465737431323334746573743132333474657374313233347465737431323334
[success]

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ cat testfile_v12.enc
tu2cG4h/VaU39f8RWSRhQW1NSHYvFCSSdjVGAhiynLXXeAilCQNsFkQ56rWuu/el5nN7HBnlSuT9kEurALyi7M0EEWgnfNk07tJYKcpJtcGDgMrXCIxQNxo+4HCezy4Q
[success]

To break that down:

  • (: open a subshell to avoid leaking variables
  • iv="$(head -c 16 /dev/urandom)": create an iv shell variable with 16 random bytes
  • iv_hex="$(print -n "${iv}" | xxd -p)": encode those bytes in hexadecimal format (note xxd is not POSIX-standard, but is available anywhere vim is installed; there are other ways to get a hex dump)
  • >&2 echo "IV_HEX=${iv_hex}": print the hex-encoded iv to stderr for debugging
  • key_hex="$(print -n "${KEY}" | xxd -p | tr -d '\n')": encode the key bytes in hexadecimal format, stripping the newlines that xxd inserts
  • >&2 echo "KEY_HEX=${key_hex}": print the hex-encoded key to stderr for debugging
  • ciphertext="$(openssl enc -aes-256-cbc -salt -pbkdf2 -K "${key_hex}" -iv "${iv_hex}" -in testfile_v12.txt)": create a ciphertext local variable containing the output of the openssl enc command, notably:
    • -K "${key_hex}": use the provided hex-encoded key; we cannot use the -k option, because it accepts a password from which a key is derived, and the Logstash cipher filter requires the actual key.
    • -iv "${iv_hex}": use the provided hex-encoded iv
    • -salt: has no effect here (nor does -pbkdf2), since supplying a raw key with -K bypasses password-based key derivation entirely
  • print -n "${iv}${ciphertext}" | base64: base64-encode the iv+ciphertext pair
  • ) > testfile_v12.enc: redirect the output of the subshell to file
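
For readability, the same procedure can be written as a standalone script instead of a one-liner. This is an illustrative sketch rather than the command I actually ran: it assumes the same KEY environment variable and testfile_v12.txt input, uses openssl rand -hex for the IV so that no raw bytes need to be held in shell variables, and drops -salt/-pbkdf2 since they have no effect when a raw key is supplied with -K. Note that GNU base64 wraps long output, so on Linux base64 -w0 would be needed to keep the payload on a single line.

#!/usr/bin/env bash
set -euo pipefail

iv_hex="$(openssl rand -hex 16)"                        # 16 random bytes, hex-encoded
key_hex="$(printf '%s' "${KEY}" | xxd -p | tr -d '\n')" # hex-encode the 32-byte key
{
  printf '%s' "${iv_hex}" | xxd -r -p                   # raw IV first ...
  openssl enc -aes-256-cbc -K "${key_hex}" -iv "${iv_hex}" -in testfile_v12.txt  # ... then the ciphertext
} | base64 > testfile_v12.enc                           # base64-wrap the iv+ciphertext pair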

I then validated that, after decoding the base64 wrapper, the first 16 bytes of the resulting payload were identical to the iv:

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ cat testfile_v12.enc| base64 --decode | head -c 16 | xxd -p
b6ed9c1b887f55a537f5ff1159246141
[success]
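
As an additional check (not part of the original session), the payload can be decrypted outside of Logstash entirely using the IV_HEX and KEY_HEX values printed earlier: strip the base64 wrapper, skip the 16-byte IV prefix, and hand the remainder to openssl. This should print the original plaintext line:

cat testfile_v12.enc | base64 --decode | tail -c +17 \
  | openssl enc -d -aes-256-cbc \
      -K 7465737431323334746573743132333474657374313233347465737431323334 \
      -iv b6ed9c1b887f55a537f5ff1159246141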

Validating Logstash Cipher Filter

Then I invoked Logstash and observed that the base64-wrapped iv+ciphertext pair was properly decoded by the Cipher Filter:

╭─{ rye@perhaps:~/src/elastic/scratch/202406-cipher-filter }
╰─○ CIPHERTEXT="$(cat testfile_v12.enc)" logstash-8.14.1/bin/logstash -f "${PWD}/pipeline.conf"
Using bundled JDK: /Users/rye/src/elastic/scratch/202406-cipher-filter/logstash-8.14.1/jdk.app/Contents/Home
Sending Logstash logs to /Users/rye/src/elastic/scratch/202406-cipher-filter/logstash-8.14.1/logs which is now configured via log4j2.properties
[2024-06-12T23:40:10,115][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/rye/src/elastic/scratch/202406-cipher-filter/logstash-8.14.1/config/log4j2.properties
[2024-06-12T23:40:10,117][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.14.1", "jruby.version"=>"jruby 9.4.7.0 (3.1.4) 2024-04-29 597ff08ac1 OpenJDK 64-Bit Server VM 17.0.11+9 on 17.0.11+9 +indy +jit [arm64-darwin]"}
[2024-06-12T23:40:10,118][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-06-12T23:40:10,119][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-06-12T23:40:10,119][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-06-12T23:40:10,132][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-06-12T23:40:10,313][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9601, :ssl_enabled=>false}
[2024-06-12T23:40:10,389][INFO ][org.reflections.Reflections] Reflections took 32 ms to scan 1 urls, producing 132 keys and 468 values
[2024-06-12T23:40:10,426][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-06-12T23:40:10,431][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-12T23:40:10,438][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/Users/rye/src/elastic/scratch/202406-cipher-filter/pipeline.conf"], :thread=>"#<Thread:0x7d116daa /Users/rye/src/elastic/scratch/202406-cipher-filter/logstash-8.14.1/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-12T23:40:10,625][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.19}
[2024-06-12T23:40:10,627][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
{"@timestamp":"2024-06-12T23:40:10.631066Z","message":"tu2cG4h/VaU39f8RWSRhQW1NSHYvFCSSdjVGAhiynLXXeAilCQNsFkQ56rWuu/el5nN7HBnlSuT9kEurALyi7M0EEWgnfNk07tJYKcpJtcGDgMrXCIxQNxo+4HCezy4Q","event":{"sequence":0,"original":"tu2cG4h/VaU39f8RWSRhQW1NSHYvFCSSdjVGAhiynLXXeAilCQNsFkQ56rWuu/el5nN7HBnlSuT9kEurALyi7M0EEWgnfNk07tJYKcpJtcGDgMrXCIxQNxo+4HCezy4Q"},"decrypted_message":"this is a test, let's see if it works Wed Jun 12 23:37:13 UTC 2024\n","@version":"1","host":{"name":"perhaps"}}
[2024-06-12T23:40:10,819][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-06-12T23:40:10,825][INFO ][logstash.agent           ] Pipelines running {:count=>0, :running_pipelines=>[], :non_running_pipelines=>[:main]}
[2024-06-12T23:40:10,826][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-06-12T23:40:10,828][INFO ][logstash.runner          ] Logstash shut down.
[success (00:00:05)]