Add the following library to your `build.sbt`:
libraryDependencies += "org.bouncycastle" % "bcprov-jdk16" % "1.46"
const { PassThrough } = require("stream");

// Merge multiple readable streams into a single PassThrough stream.
// Data is interleaved as it arrives; the merged stream ends only after
// every source stream has ended.
function concatStreams(streams) {
  const pass = new PassThrough();
  let waiting = streams.length;
  for (const stream of streams) {
    stream.pipe(pass, { end: false });
    stream.once("end", () => --waiting === 0 && pass.end());
  }
  return pass;
}
$ErrorActionPreference = "Stop"
Set-StrictMode -Version 3.0
<#--------------------------------------------------------------------------
https://gallery.technet.microsoft.com/scriptcenter/The-PowerShell-script-for-2a2456c4
.SYNOPSIS
Script for running T-SQL files in MS SQL Server
Author: Andy Mishechkin
--------------------------------------------------------------------------#>
# Usage: .\kuduSiteUpload.ps1 -websiteName mySite -sourceDir C:\Temp\mydir
Param(
    [Parameter(Mandatory = $true)]
    [string]$websiteName,
    [Parameter(Mandatory = $true)]
    [string]$sourceDir,
    [string]$destinationPath = "/site/wwwroot"
)
/*
https://scalafiddle.io/sf/sniohcZ/3
(old version: https://scalafiddle.io/sf/sniohcZ/1)

Use PassThroughFlow when a message passes through a flow that transforms it,
but you also want to keep the original message for a subsequent flow.

For example, when you consume messages from Kafka (CommittableMessage):
you process the message (transform it, save it to a database, ...), and then
you need the original message again in order to commit the offset.
*/
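The Akka Streams implementation lives in the fiddle above (built from Broadcast and Zip stages). As a minimal plain-Scala sketch of the idea, pair each transformed result with its original input so a later step can still see the original (names here are illustrative, not Akka APIs):

```scala
// Conceptual pass-through: keep the original value next to the transformed one.
def passThrough[A, T](f: A => T): A => (T, A) = a => (f(a), a)

// Hypothetical messages standing in for Kafka CommittableMessages.
val messages = List("msg-1", "msg-2")
val processed = messages.map(passThrough((s: String) => s.toUpperCase))
// processed == List(("MSG-1", "msg-1"), ("MSG-2", "msg-2"))
// A following stage can now use the second element to commit the offset.
```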
# Print the current UTC time in ISO 8601 format (e.g. 2024-01-01T00:00:00+00:00)
date --utc --iso-8601=seconds
#!/usr/bin/env python3
"""
Based on: https://gist.github.com/ottokruse/1c0f79d51cdaf82a3885f9b532df1ce5

Usage:
- Save this script somewhere on your PATH (e.g. `vi /usr/local/bin/aws-console && chmod +x /usr/local/bin/aws-console`)
- Install dependencies: pip install boto3
- Make AWS credentials available in one of the usual places where boto3 can find them (~/.aws/credentials, env vars, etc.)
- Execute the script: `AWS_PROFILE=your-profile aws-console`
- :tada: Your browser opens and you are signed in to the AWS console
"""
# Delete all local Git branches matching `fix/` (remote branches are untouched;
# -d refuses to delete branches that are not fully merged)
git branch | grep 'fix/' | xargs -I '{}' git branch -d '{}'
# Calculate a version that is incremented on every build, resets whenever Major or Minor changes, and is distinct per branch.
# Solution based on https://k2vacademy.com/2019/04/03/hidden-gems-in-azure-pipelines-creating-your-own-rev-variable-using-counter-expression-in-azure-pipelines/
# This can be useful when gitversion cannot be used.
# Otherwise, the best solution is to use the default gitversion task instead: https://github.com/GitTools/actions/blob/main/docs/examples/azure/gitversion/execute/usage-examples.md
# ....
variables:
  # Versioning uses a format like major.minor.patch{-snapshot}.
  # Major and minor are fixed; patch comes from a counter that resets to 0 whenever the other values change.
  # (The values below are illustrative; adjust major/minor per release.)
  major: 1
  minor: 0
  patch: $[counter(format('{0}.{1}-{2}', variables['major'], variables['minor'], variables['Build.SourceBranchName']), 0)]
  version: $(major).$(minor).$(patch)
#!/bin/bash
# set -e = exit immediately if a command fails
# set -u = treat unset variables as errors
# set -o pipefail = a pipeline fails if any command in it fails, not just the last one
# more info: https://gist.github.com/usametov/a134115a0fa1157b45ea5d432510d2f6
set -euo pipefail
# TODO Add your script