Chris Maggiulli (cmaggiulli)
@cmaggiulli
cmaggiulli / canvas_data_dap_custom_download_names.py
Created February 10, 2024 21:35 — forked from hanleybrand/canvas_data_dap_custom_download_names.py
Importing instructure-dap-client and using it programmatically - quickstart example scripts
import os
from pathlib import Path
import shutil
from urllib.parse import urlparse
from dap.api import DAPClient
from dap.dap_types import Format, IncrementalQuery, SnapshotQuery
import requests
output_dir_base = Path("downloads")
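The preview above cuts off after the imports and the output directory. For context, a minimal snapshot download with this client looks roughly like the sketch below; the "accounts" table, the no-argument DAPClient() constructor, and the environment variables it reads (DAP_API_URL, DAP_CLIENT_ID, DAP_CLIENT_SECRET) are assumptions based on the library's quickstart and may differ between instructure-dap-client versions.

import asyncio
from dap.api import DAPClient
from dap.dap_types import Format, SnapshotQuery

async def download_accounts_snapshot() -> None:
    # Assumption: DAPClient() with no arguments reads DAP_API_URL,
    # DAP_CLIENT_ID and DAP_CLIENT_SECRET from the environment.
    async with DAPClient() as session:
        query = SnapshotQuery(format=Format.JSONL, mode=None)
        # Download a full snapshot of the "accounts" table into ./downloads,
        # matching the output_dir_base shown in the preview above.
        await session.download_table_data("canvas", "accounts", query, "downloads")

asyncio.run(download_accounts_snapshot())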
@cmaggiulli
cmaggiulli / README.md
Created November 16, 2022 01:43 — forked from magnetikonline/README.md
NSSM - the Non-Sucking Service Manager cheatsheet.
@cmaggiulli
cmaggiulli / boto_session.py
Created October 2, 2022 01:06 — forked from pritul95/boto_session.py
Refreshable Boto3 Session for creating an auto-refreshing client or resource
from uuid import uuid4
from datetime import datetime
from time import time
import boto3
from boto3 import Session
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
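The fork's preview shows only the imports; the pattern it implements is roughly the sketch below, where RefreshableCredentials is seeded and refreshed from sts.assume_role. The role ARN and session name are placeholders, and assigning to session._credentials touches a private botocore attribute, as the gist's approach does. Uses the imports shown above.

def refreshable_session(role_arn, session_name=None):
    # Sketch: build a boto3 Session whose credentials refresh themselves.
    session_name = session_name or f"autorefresh-{uuid4().hex}"
    sts = boto3.client("sts")

    def _refresh():
        # Re-assume the role and return credentials in the shape
        # RefreshableCredentials expects.
        creds = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)["Credentials"]
        return {
            "access_key": creds["AccessKeyId"],
            "secret_key": creds["SecretAccessKey"],
            "token": creds["SessionToken"],
            "expiry_time": creds["Expiration"].isoformat(),
        }

    refreshable = RefreshableCredentials.create_from_metadata(
        metadata=_refresh(),
        refresh_using=_refresh,
        method="sts-assume-role",
    )
    botocore_session = get_session()
    botocore_session._credentials = refreshable  # private attribute, as in the gist
    return Session(botocore_session=botocore_session)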
@cmaggiulli
cmaggiulli / git-cherry-pick.txt
Created September 20, 2022 16:31 — forked from ViktorAndonov/git-cherry-pick.txt
How to cherry-pick commits
0.1) Use the terminal;
1) Make sure there are no uncommitted changes in either branch (git status);
2) Check out the branch that you want to take the commit from;
3) Use git log to find the full hash of the commit you want to cherry-pick, and copy it from the terminal;
4) Now check out the branch that will receive the commit;
5) Then run: git cherry-pick [the hash of the commit you want to take, without brackets]
6) You are done! If conflicts appear, resolve them manually;
7) Then commit the changes;
@cmaggiulli
cmaggiulli / README.md
Created September 16, 2022 08:07 — forked from MattFanto/README.md
Django appenddata

An alternative to Django's loaddata command for when you need to append fixture objects to an existing database. Instead of merging fixture data into your existing models (as loaddata does), it appends every fixture object by resetting its primary key. M2M relations are handled at the end of the process by mapping the old primary keys to the new ones:

Example test (appending data from Website 2 into Website 1):

# Website 1
python manage.py dumpdata app1 app2 ... > test_append_data_fixtures_pre.json

# Website 2
python manage.py dumpdata app1 app2 ... > fixture_to_import.json
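The appenddata command itself lives in the forked gist; the core idea described above (append by resetting primary keys, then remap relations) can be sketched in plain Django as below. The function name and the decision to skip M2M and foreign-key remapping are simplifications for illustration, not the gist's actual implementation.

from django.core import serializers

def append_fixture(path):
    """Simplified sketch: insert every fixture object with a fresh pk and
    return the old->new pk mapping. The real command also uses this mapping
    to fix up M2M and FK references, which this sketch omits."""
    pk_map = {}
    with open(path) as fh:
        for deserialized in serializers.deserialize("json", fh.read()):
            obj = deserialized.object
            old_pk = obj.pk
            obj.pk = None   # force an INSERT so the database assigns a new pk
            obj.save()
            pk_map[(obj._meta.label_lower, old_pk)] = obj.pk
    return pk_map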
@cmaggiulli
cmaggiulli / keep-jenkins-plugins-uptodate-with-no-warnings.groovy
Last active April 3, 2022 05:35 — forked from alecharp/keep-jenkins-plugins-uptodate.groovy
Simple Groovy script to upgrade active plugins when new versions are available (without warnings)
// Refresh update-center metadata, skipping sites whose data carries warnings
jenkins.model.Jenkins.getInstance().getUpdateCenter().getSites().findAll { !it.data.warnings }.each { site ->
    site.updateDirectlyNow(hudson.model.DownloadService.signatureCheck)
}
// Refresh downloadable tool metadata
hudson.model.DownloadService.Downloadable.all().each { downloadable ->
    downloadable.updateNow();
}
// Collect the active plugins that have an update available
def plugins = jenkins.model.Jenkins.instance.pluginManager.activePlugins.findAll {
    it -> it.hasUpdate()
}
PROJECT_DIR=/home/pentaho/projects
KETTLE_REDIRECT_STDERR=Y
KETTLE_REDIRECT_STDOUT=Y
KETTLE_MAX_JOB_TRACKER_SIZE=5000
KETTLE_MAX_LOGGING_REGISTRY_SIZE=10000
KETTLE_MAX_LOG_SIZE_IN_LINES=
KETTLE_LOG_MARK_MAPPINGS=Y
PROJECT_DIR=/home/pentaho/projects
KETTLE_REDIRECT_STDERR=Y
KETTLE_REDIRECT_STDOUT=Y
KETTLE_MAX_LOGGING_REGISTRY_SIZE=10000
KETTLE_LOG_MARK_MAPPINGS=Y
KETTLE_JOB_LOG_SCHEMA=pentaho_dilogs
KETTLE_JOB_LOG_DB=live_logging_info
KETTLE_JOB_LOG_TABLE=job_logs
@cmaggiulli
cmaggiulli / mysql-faster-imports.sh
Created April 29, 2019 23:05 — forked from OZZlE/mysql-faster-imports.sh
Linux Bash script to toggle faster MySQL DB imports
#!/usr/bin/env bash
# USAGE: mysqlOptimizeForImports <- before importing
# mysqlDefaultSettings <- to go back to normal
# Based on https://dba.stackexchange.com/questions/83125/mysql-any-way-to-import-a-huge-32-gb-sql-dump-faster/83385#83385
mysqlStateFile="$HOME/mysql.optimized.for.exports"
mysqlConfigLocation="/etc/mysql/my.cnf" # <-- change to the correct path for your system; should point at the global mysql settings file