OS version: macOS Catalina 10.15.3

Setup

REALM=EXAMPLE.COM
KDC_KADMIN_SERVER=$(ipconfig getifaddr en0)
CUSTOM_PRINCIPAL=user/example.com
CUSTOM_PRINCIPAL_PASSWORD=user
CUSTOM_KEYTAB_PATH=$HOME/user.keytab
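A minimal sketch of how these variables can be wired into the local Kerberos client and used to create the principal and its keytab (the krb5.conf layout and the admin/admin credentials are assumptions, adjust them to the actual KDC):

# Sketch only: point the local Kerberos configuration at the KDC above
sudo tee /etc/krb5.conf > /dev/null << EOF
[libdefaults]
  default_realm = $REALM

[realms]
  $REALM = {
    kdc = $KDC_KADMIN_SERVER
    admin_server = $KDC_KADMIN_SERVER
  }
EOF

# Sketch only: admin/admin with password admin is an assumption for the KDC
kadmin -r $REALM -p admin/admin -w admin -q "addprinc -pw $CUSTOM_PRINCIPAL_PASSWORD $CUSTOM_PRINCIPAL"
kadmin -r $REALM -p admin/admin -w admin -q "ktadd -k $CUSTOM_KEYTAB_PATH $CUSTOM_PRINCIPAL"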

Installation

  1. Create a managed Directory Service (see the CLI sketch after this list)
  2. Launch a Microsoft Windows Server 2019 with SQL Server 2019 Enterprise instance - ami-08d76971476b11b4f
    The following IAM policies must be attached to the instance role to connect to the AD properly:
    • AmazonEC2FullAccess
    • AmazonSSMManagedInstanceCore
    • AmazonSSMDirectoryServiceAccess

Make sure that during instance creation the "join to the existing domain" dropdown is set properly.
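For step 1, a possible AWS CLI sketch is shown below (aws ds create-microsoft-ad); the directory name mirrors the realm above, while the password, VPC and subnet IDs are placeholders to substitute:

# Sketch only: password, VPC and subnet IDs are placeholders
aws ds create-microsoft-ad \
  --name example.com \
  --short-name EXAMPLE \
  --password 'Admin123!' \
  --edition Standard \
  --vpc-settings VpcId=vpc-xxxxxxxx,SubnetIds=subnet-aaaaaaaa,subnet-bbbbbbbb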

To read what kind of authentication scheme is used:

sqlcmd -S example.com -U sa -P Mssql123 -Q "SELECT auth_scheme FROM sys.dm_exec_connections WHERE session_id = @@spid"
  • SQL: When SQL Server authentication is used
  • NTLM: When NTLM authentication is used
  • KERBEROS: When KERBEROS authentication is used
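To verify that KERBEROS is actually reported, one possible flow is to obtain a ticket from the keytab created in the Setup section and connect with a trusted connection (a sketch, assuming sqlcmd -E picks up the ticket cache):

kinit -kt $CUSTOM_KEYTAB_PATH $CUSTOM_PRINCIPAL
sqlcmd -S example.com -E -Q "SELECT auth_scheme FROM sys.dm_exec_connections WHERE session_id = @@spid"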

To read what kind of authentication method is used:

sqlcmd -S example.com -U sa -P Mssql123 -Q "EXEC xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'LoginMode'"

LoginMode values:

  • 1: Windows Authentication
  • 2: SQL Server and Windows Authentication mode

To set it:
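One way is the write counterpart of the call above (a sketch using xp_instance_regwrite; value 2 enables SQL Server and Windows Authentication mode, and the SQL Server service has to be restarted for the change to take effect):

sqlcmd -S example.com -U sa -P Mssql123 -Q "EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'LoginMode', REG_DWORD, 2"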

# Setup minikdc
cat << 'EOF' > minikdc_deps.gradle
apply plugin: 'java'

repositories {
   mavenCentral()
}

dependencies {
   // Assumption: minikdc comes from the hadoop-minikdc artifact; adjust the version as needed
   implementation 'org.apache.hadoop:hadoop-minikdc:3.2.0'
}
EOF
cd external/docker-integration-tests
mvn install -DskipTests -Dscalastyle.skip=true -Dcheckstyle.skip
  • Go to IntelliJ and recompile the docker-integration-tests project.
  • Start the test from IntelliJ:
class KafkaSinkBatchSuiteV2
...
  test("single node batch") {
    val topic = newTopic()
    testUtils.createTopic(topic)
    val rand = new Random()
    val data = Seq.fill(100000)(Row(topic, rand.nextInt().toString))

    // Assumed continuation: build the DataFrame from the generated rows and write it to Kafka
    val df = spark.createDataFrame(
      spark.sparkContext.parallelize(data),
      StructType(Seq(StructField("topic", StringType), StructField("value", StringType))))
    df.write
      .format("kafka")
      .option("kafka.bootstrap.servers", testUtils.brokerAddress)
      .save()
  }
class KafkaSinkStreamingSuite
...
  test("single node streaming") {
    val input = MemoryStream[String]
    val topic = newTopic()
    testUtils.createTopic(topic)

    // Assumed continuation: named arguments mirror the createKafkaWriter helper of Spark's KafkaSinkSuite
    val writer = createKafkaWriter(
      input.toDF(),
      withTopic = Some(topic))()
    input.addData("1", "2", "3")
    writer.processAllAvailable()
    writer.stop()
  }
git config --global merge.tool meld
git config --global diff.tool meld
git config --global mergetool.meld.path "C:\Program Files (x86)\Meld\meld\meld.exe"
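After this, Meld can be invoked for example as follows (sketch):

# Review local changes with Meld, or resolve conflicts during a merge
git difftool HEAD~1
git mergetool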