Mike Kraus (MBKraus)

  • Mollie
  • Amsterdam, the Netherlands
MBKraus / ckad_bookmarks_2022.html
Last active March 4, 2022 11:48
Bookmarks for the Certified Kubernetes Application Developer (CKAD) certification exam (dated 03/2022)
<!DOCTYPE NETSCAPE-Bookmark-file-1>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=UTF-8">
<TITLE>Bookmarks</TITLE>
<H1>Bookmarks</H1>
<DL><p>
<DT><H3>Kubernetes</H3>
<DL><p>
<DT><A HREF="https://kubernetes.io/docs/tasks/inject-data-application/define-command-argument-container/" ADD_DATE="1644835127">
MBKraus / main-CI.yaml
Last active December 10, 2021 14:38
CI - Part III
- stage: Deployment
  displayName: Deployment stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: '$(environmentName)'
MBKraus / main-CI.yaml
Last active December 10, 2021 14:58
CI - Part II
stages:
- stage: Build
  displayName: Build
  jobs:
  - job: DockerImage
    displayName: Build and push Docker image
    steps:
    - task: Docker@1
MBKraus / main-CI.yaml
Last active December 10, 2021 14:39
CI - Part I
name: main-CI

trigger:
- main

pool:
  vmImage: ubuntu-20.04

variables:
  azureServiceConnection: <Azure Service Connection name>
MBKraus / Dockerfile
Created December 8, 2021 19:55
Dockerfile for the Docker image of the Azure Function
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.8-slim

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY . /home/site/wwwroot

# Build variables
ENV DEBIAN_FRONTEND noninteractive
MBKraus / __init__.py
Last active December 10, 2021 14:00
__init__.py of our Azure Function
import os

import azure.functions as func
import pandas as pd
import pyodbc
from azure.storage.blob import BlobClient, ContainerClient

# Database credentials are injected via the Function App's application settings
SERVER = os.environ.get("DB_SERVER")
DB_NAME = os.environ.get("DB_NAME")
USERNAME = os.environ.get("DB_USERNAME")
PASSWORD = os.environ.get("DB_PASSWORD")
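The preview ends at the credentials; a minimal sketch of how they might be combined into a pyodbc connection string (the driver, trigger type, and table name are assumptions, not the original gist code):

# Hypothetical continuation -- a sketch, not the original gist code.
# Assumes an Azure SQL database and the Microsoft ODBC Driver 17.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    f"SERVER={SERVER};DATABASE={DB_NAME};UID={USERNAME};PWD={PASSWORD}"
)

def main(req: func.HttpRequest) -> func.HttpResponse:
    """Entry point; reads a placeholder table and reports the row count."""
    with pyodbc.connect(CONN_STR) as conn:
        df = pd.read_sql("SELECT * FROM some_table", conn)  # placeholder table
    return func.HttpResponse(f"{len(df)} rows read")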
MBKraus / subtract_gaussian_filter.py
Created September 4, 2020 09:48
Subtract Gaussian filter
import cv2
import numpy as np

def subtract_Gaussian_filter(image, ksize=0):
    """Convolve with a Gaussian filter and subtract the result from the original."""
    image = np.uint8(image)
    # 4*image - 4*blur + 128: amplify the difference and shift it into the 0-255 range.
    # ksize=0 lets OpenCV derive the kernel size from the sigma (image height / 30).
    im_rescaled = cv2.addWeighted(
        image, 4, cv2.GaussianBlur(image, (ksize, ksize), image.shape[0] / 30.0), -4, 128)
    im_rescaled = im_rescaled.reshape(image.shape)
    return im_rescaled
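A quick usage sketch (file names and the grayscale flag are placeholders, not from the gist):

# Hypothetical usage -- paths are assumptions.
img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)
filtered = subtract_Gaussian_filter(img)
cv2.imwrite("output.png", filtered)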
MBKraus / update_functions.py
Last active November 5, 2019 15:44
Periodically updates the model
import logging
import os
import time

import keras
import mlflow.keras
from keras.models import Sequential, load_model

def load_current_model(model_path, file_m):
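The preview stops at the function signature; a plausible body, assuming the model is saved as a single Keras file under model_path (an assumption, not the original gist code):

# Hypothetical body -- a sketch, not the original gist code.
def load_current_model(model_path, file_m):
    """Load the currently deployed Keras model from disk."""
    return load_model(os.path.join(model_path, file_m))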
MBKraus / update_DAG.py
Last active November 5, 2019 20:06
DAG that updates the model on a daily basis
import airflow
from airflow.models import DAG
from airflow.operators.python_operator import PythonOperator

from src.data.data_functions import get_data_from_kafka, load_data
from src.models.update_functions import load_current_model, update_model, data_to_archive
from src.preprocessing.preprocessing_functions import preprocessing

# Kafka broker and topic the DAG pulls new training data from
CLIENT = 'kafka:9092'
TOPIC = 'TopicA'
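The preview ends before the DAG itself; a minimal sketch of how such a daily retraining DAG is typically wired together (task names, schedule, and operator arguments are assumptions, not the original gist code):

# Hypothetical DAG wiring -- a sketch, not the original gist code.
args = {'owner': 'airflow', 'start_date': airflow.utils.dates.days_ago(1)}

with DAG(dag_id='update_model_daily', default_args=args,
         schedule_interval='@daily') as dag:

    fetch = PythonOperator(
        task_id='get_data_from_kafka',
        python_callable=get_data_from_kafka,
        op_kwargs={'client': CLIENT, 'topic': TOPIC})

    update = PythonOperator(
        task_id='update_model',
        python_callable=update_model)

    fetch >> update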
MBKraus / data_functions.py
Last active November 5, 2019 15:47
Data functions for fetching data from Kafka, putting it in the right format and storing it
import logging
import os
import pickle
import time
from json import loads

import numpy as np
from kafka import KafkaConsumer, TopicPartition

def decode_json(jsons_comb):
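The preview cuts off at the signature; given the `loads` import, a plausible body assuming `jsons_comb` is an iterable of JSON strings (an assumption, not the original gist code):

# Hypothetical body -- a sketch, not the original gist code.
def decode_json(jsons_comb):
    """Decode an iterable of JSON strings into Python objects."""
    return [loads(j) for j in jsons_comb]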