Alex Combessie (alexcombessie)

🤓 Building @Giskard-AI - Open-Source CI/CD for ML products
alexcombessie / install_giskard.sh
Created June 6, 2023 18:12
Install Giskard 2.0 Beta
pip install giskard==2.0.0b2
"""
Summary: Tests if the model prediction is invariant when the feature values are perturbed
Description: Test if the predicted classification label remains the same after
feature values perturbation.The test is passed when the percentage of unchanged
rows is higher than the threshold
Args:
df(GiskardDataset):
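
For context, the test logic that this docstring describes can be sketched in plain pandas/NumPy as below. The model interface, the perturbation function and the 0.95 threshold are illustrative assumptions, not the Giskard implementation.

# Hedged sketch of the invariance test described above -- not the Giskard implementation.
import numpy as np

def prediction_is_invariant(model, df, column, perturb, threshold=0.95):
    """Pass when the share of unchanged predicted labels exceeds `threshold`."""
    baseline = model.predict(df)                                 # labels on the original rows
    perturbed_df = df.copy()
    perturbed_df[column] = perturbed_df[column].apply(perturb)   # e.g. lambda x: x * 1.1
    perturbed = model.predict(perturbed_df)                      # labels after the perturbation
    unchanged_ratio = float(np.mean(np.asarray(baseline) == np.asarray(perturbed)))
    return unchanged_ratio >= threshold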
alexcombessie / giskard_ml_worker_start.sh
Created November 23, 2022 15:26
How to start Giskard ML worker
giskard worker start -h <giskard server ip address>
alexcombessie / giskard_dev_setup.sh
Created August 11, 2022 11:11
Giskard Dev Setup on M1 Mac
# Run the shell under x86_64 emulation (Rosetta) on an M1 Mac
arch -x86_64 zsh
# Build and package the Giskard server
./gradlew build --parallel
./gradlew -Pprod bootJar
java -jar giskard-server/build/libs/giskard-server.jar
# Start the ML worker and the frontend dev server
cd giskard-ml-worker && PYTHONPATH=generated .venv/bin/python main.py
cd giskard-frontend && npm run serve

# Run the full stack with Docker Compose
git clone https://github.com/Giskard-AI/giskard.git
cd giskard
docker-compose up -d
from keras.layers import Input, Dense, LSTM, Reshape
from keras.models import Model
# Define the keras architecture of your model in 'build_model' and return it. Compilation must be done in 'compile_model'.
# input_shapes - dictionary of shapes per input as defined in features handling
# n_classes - For classification, number of target classes
def build_model(input_shapes, n_classes=None):
# This input will receive all the preprocessed features
window_size = 30
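
The preview stops inside build_model (only window_size is set). Purely as an illustration of how the imports above could be wired together, a completed version might look like the sketch below; the "main" input name, the layer sizes and the reshape into 30 time steps are assumptions, not the original gist's architecture.

# Hedged sketch only -- continues from the keras imports above.
def build_model(input_shapes, n_classes=None):
    window_size = 30
    # Flat preprocessed features arrive through a single input ("main" name is an assumption)
    input_main = Input(shape=input_shapes["main"], name="main")
    # Fold the flat vector into (time steps, features per step); assumes the
    # feature count is divisible by window_size
    x = Reshape((window_size, -1))(input_main)
    x = LSTM(64)(x)                                          # arbitrary 64-unit recurrent layer
    predictions = Dense(n_classes, activation="softmax")(x)  # classification head
    return Model(inputs=[input_main], outputs=predictions)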
# -------------------------------------------------------------------------------- NOTEBOOK-CELL: CODE
# -*- coding: utf-8 -*-
import dataiku
import pandas as pd, numpy as np
import os
# Read recipe inputs
positive_python = dataiku.Dataset("positive_python")
df = positive_python.get_dataframe()
# Write recipe outputs
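
The preview breaks off at the output step. A recipe of this shape normally finishes by writing the dataframe to the recipe's output dataset, roughly as below; the output dataset name is a placeholder, not the one from the gist.

# Hedged continuation -- "positive_python_prepared" is a placeholder output dataset name
positive_python_prepared = dataiku.Dataset("positive_python_prepared")
positive_python_prepared.write_with_schema(df)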
alexcombessie / dataiku_dss_training_deployer.py
Created April 20, 2018 17:30
Deploying training projects on Dataiku DSS
# Parameters
number_of_groups = 5
number_of_users_per_group = 3
generic_password = "password4Training!!"
analysis_id, mltask_id = u'bOkH96fi', u'E7bXC2fR'
saved_model_id="sXGr0kU2"
# Group creation
import dataiku
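
The preview ends right after `import dataiku`. Continuing from the parameters above, the group and user creation announced by the comment could be sketched as below with the public API client (`dataiku.api_client()`, `create_group`, `create_user`); the naming scheme and the loop are assumptions, not the original script.

# Hedged sketch of the "Group creation" step; the naming scheme is an assumption.
client = dataiku.api_client()
for g in range(1, number_of_groups + 1):
    group_name = "training_group_%i" % g
    client.create_group(group_name)
    for u in range(1, number_of_users_per_group + 1):
        login = "%s_user_%i" % (group_name, u)
        client.create_user(login, generic_password, display_name=login, groups=[group_name])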
alexcombessie / custom_trigger_all_sql_dataset_change.py
Created February 14, 2018 17:12
[Dataiku scenario] Custom trigger to check if all SQL datasets have changed
# LIBRARY LOADING
from dataiku.scenario import Trigger
import dataiku
import pandas as pd
from dataiku.core.sql import SQLExecutor2
import json
# USER INPUT - CHANGE THE DICTIONARY BELOW
## This should be a dictionary of KEY: VALUE
## with KEY being names of datasets you want to monitor
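
The preview stops inside the user-input comment. The remainder of such a trigger typically probes each SQL dataset, compares against the values seen on the previous run, and fires only when all of them have changed. A rough sketch, continuing from the imports above: the dataset names, the COUNT(*) probe, and the use of project variables to remember previous values are all assumptions, not the original gist's logic.

# Hedged sketch of the trigger body -- placeholders and assumptions throughout.
datasets_to_monitor = {"my_sql_dataset_1": None, "my_sql_dataset_2": None}

t = Trigger()
project = dataiku.api_client().get_project(dataiku.default_project_key())
variables = project.get_variables()
previous_counts = variables["standard"].get("sql_trigger_counts", {})

current_counts = {}
all_changed = True
for dataset_name in datasets_to_monitor:
    executor = SQLExecutor2(dataset=dataset_name)
    # Assumes the SQL table name matches the dataset name
    probe = executor.query_to_df("SELECT COUNT(*) AS n FROM %s" % dataset_name)
    current_counts[dataset_name] = int(probe.iloc[0, 0])
    if current_counts[dataset_name] == previous_counts.get(dataset_name):
        all_changed = False

variables["standard"]["sql_trigger_counts"] = current_counts
project.set_variables(variables)

if all_changed:
    t.fire()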
alexcombessie / predictive_API_deployment.py
Last active December 7, 2017 13:47
Deploy Dataiku predictive API package from Design to API node
# -*- coding: utf-8 -*-
# Author = COMBESSIE, Alexandre (Dataiku)
## How to deploy a DSS API package from Design or Automation to API node
## This script is designed to be run in a DSS Design or Automation node
## It can run on an external system with a Dataiku API client token
## For details on how to query the deployed API endpoint,
## check https://doc.dataiku.com/dss/api/4.1/apinode-user/
## Note that for Python/R function endpoints the URL should end with /run instead of /predict
# LIBRARIES
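
The preview cuts off at the library imports. A rough sketch of the API-node side of such a deployment follows; the URL, admin key, service id and package path are placeholders, and the APINodeAdminClient method names are my reading of the dataikuapi client for DSS 4.x -- verify them against the documentation linked above before use.

# Hedged sketch only: placeholders throughout, and the admin-client method names
# should be checked against the dataikuapi documentation for your DSS version.
import dataikuapi

API_NODE_URL = "http://localhost:12000"            # placeholder API node base URL
API_NODE_ADMIN_KEY = "YOUR_API_NODE_ADMIN_KEY"     # placeholder admin API key
SERVICE_ID = "my_service"                          # placeholder service id
PACKAGE_PATH = "/path/to/my_service_package.zip"   # package exported from the Design node

admin_client = dataikuapi.APINodeAdminClient(API_NODE_URL, API_NODE_ADMIN_KEY)
admin_client.create_service(SERVICE_ID)            # skip if the service already exists
service = admin_client.service(SERVICE_ID)
service.import_generation(PACKAGE_PATH)            # upload the package as a new generation
service.switch_to_newest()                         # activate the newly imported generation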