Neenad Ingole (ninadingole)
Looking for next crazy idea to build
name: "My CodeQL config"

disable-default-queries: true

languages: java

queries:
  - name: Extended Security
    uses: security-extended
  - name: Security and Quality
    uses: security-and-quality

name: "CodeQL"

on:
  push:
    branches: [master]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [master]
  # schedule:
  #   - cron: '0 3 * * 4'
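
The trigger section above normally sits inside a full code-scanning workflow that points the CodeQL action at the custom config file. A minimal sketch of that wiring (the config file path and action versions are assumptions, not from the gist):

```yaml
# Assumed wiring: jobs/steps are not part of the gist above.
jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v1
        with:
          # Points at the "My CodeQL config" file shown earlier (path assumed)
          config-file: ./.github/codeql/codeql-config.yml
      - name: Autobuild
        uses: github/codeql-action/autobuild@v1
      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v1
```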

Keybase proof

I hereby claim:

  • I am ninadingole on github.
  • I am iamninad (https://keybase.io/iamninad) on keybase.
  • I have a public key ASDQ_lKHv4aZIwHqiD5wWJOCHV4euhaPoJGdYPa3Anq_hwo

To claim this, I am signing this object:

ninadingole / cloudSettings
Last active May 6, 2020 12:15
settings-sync
{"lastUpload":"2020-05-06T12:15:46.434Z","extensionVersion":"v3.4.3"}
package internal

import (
	"context"
	"encoding/json"
	"errors"
	"fmt"
	"log"
	"net/http"
	"os"
)
ninadingole / sbt.scala
Last active March 17, 2019 13:21
Organising Scala Tests In SBT
#########################################
#### Inside project/E2E.scala object ####
#########################################

import sbt._
import sbt.Keys._

object E2E {
  final val E2ETest = Configuration.of("EndToEndTest", "e2e") extend Test
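
The gist is truncated at this point; a sketch of how the custom configuration is typically completed and wired into the build (`e2eSettings`, the source directory layout, and the `build.sbt` usage below are assumptions, not from the gist):

```scala
  // Sketch: attach the standard test settings to the e2e scope and
  // point it at its own source directory. Names/paths are assumed.
  lazy val e2eSettings: Seq[Setting[_]] =
    inConfig(E2ETest)(Defaults.testSettings) ++ Seq(
      scalaSource in E2ETest := baseDirectory.value / "src" / "e2e" / "scala"
    )
}

// In build.sbt (sketch):
// lazy val root = (project in file("."))
//   .configs(E2E.E2ETest)
//   .settings(E2E.e2eSettings: _*)
```

With this in place, `sbt e2e:test` would run only the end-to-end suite while `sbt test` keeps running the regular one.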
REPLICAT rkafka
TARGETDB LIBFILE libggjava.so SET property=dirprm/rkafka_handler.props
REPORTCOUNT EVERY 1 MINUTES, RATE
GROUPTRANSOPS 1000
MAP *.*.*, TARGET *.*.*;
gg.handlerlist=kafkaconnect
#The handler properties
gg.handler.kafkaconnect.type=kafkaconnect
gg.handler.kafkaconnect.kafkaProducerConfigFile=kafkaconnect.properties
# See http://docs.oracle.com/goldengate/bd123110/gg-bd/GADBD/using-kafka-connect-handler.htm#GUID-A87CAFFA-DACF-43A0-8C6C-5C64B578D606
gg.handler.kafkaconnect.topicMappingTemplate=ora-ogg-${schemaName}-${tableName}-avro
gg.handler.kafkaconnect.keyMappingTemplate=${primaryKeys}
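
For illustration, here is how the two templates above would resolve for a captured change (the schema and table names are hypothetical):

```properties
# Hypothetical example: a change on schema MOVIEDEMO, table MOVIES
#   topicMappingTemplate -> topic "ora-ogg-MOVIEDEMO-MOVIES-avro"
#   keyMappingTemplate   -> record key built from the table's primary key column(s)
```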
bootstrap.servers=localhost:9092
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter.schema.registry.url=http://localhost:8081
# For Json
#value.converter=org.apache.kafka.connect.json.JsonConverter
#key.converter=org.apache.kafka.connect.json.JsonConverter
def buildEventStream = {
  import AppSerdes.movieBEventSerde.eventConsumed
  builder.stream[Int, BusinessEvent]("events")
}

private val eventStreams: KStreamS[Int, BusinessEvent] = buildEventStream

def filterEventsByType(eventType: String): KStreamS[Int, BusinessEvent] = {
  eventStreams.filter((_: Int, event: BusinessEvent) => event.eventType.equalsIgnoreCase(eventType))
}
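
A usage sketch for the filter above, routing each event type to its own output topic. The topic names, the `MOVIE_CREATED`/`MOVIE_UPDATED` event types, and the `eventProduced` implicit are assumed names, not from the snippet:

```scala
// Sketch: fan out the filtered streams to dedicated topics.
// Assumes AppSerdes also provides an implicit Produced instance
// (here called eventProduced), mirroring eventConsumed above.
def routeEventsByType(): Unit = {
  import AppSerdes.movieBEventSerde.eventProduced

  filterEventsByType("MOVIE_CREATED").to("movie-created-events")
  filterEventsByType("MOVIE_UPDATED").to("movie-updated-events")
}
```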