
@chazlarson
chazlarson / stop_4k_transcodes.py
Created May 6, 2025 04:04 — forked from tobiasglen/stop_4k_transcodes.py
Stop-Emby-transcodes-automatically
# This requires that your content naming scheme includes the resolution in the file name
# --> (Show Title - s01e02 - Episode Name - WEBDL-2160p - (h265 EAC3 Atmos) - GROUP.mkv)
# The script first checks locally that the ffmpeg transcode process is transcoding video (audio and container transcodes are allowed)
# and that "2160p" is in the file name; if both conditions are true, the script kills the process and shows the user an error message
import os
import re
import requests
import time
import logging
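
Below is a minimal sketch, not part of the gist, of the two local checks the comments above describe: the ffmpeg command line must show a real video transcode (video codec not set to copy) and must reference a "2160p" file name. The exact flags Emby passes to ffmpeg, and process discovery via ps, are assumptions for illustration.

import re
import subprocess

def ffmpeg_cmdlines():
    # List running processes and keep the ffmpeg command lines.
    ps = subprocess.run(["ps", "-eo", "args="], capture_output=True, text=True)
    return [line for line in ps.stdout.splitlines() if "ffmpeg" in line]

def is_4k_video_transcode(cmdline):
    # Video is being transcoded when the video stream is not stream-copied.
    video_copied = re.search(r"-c(odec)?:v(:0)? copy", cmdline) is not None
    # The naming scheme above guarantees 2160p sources carry "2160p" in the name.
    return "2160p" in cmdline and not video_copied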
@chazlarson
chazlarson / share_unshare_libraries.py
Last active September 16, 2022 20:54 — forked from JonnyWong16/share_unshare_libraries.py
Automatically share and unshare libraries for Plex users
# Run this script using "share" or "unshare" as arguments:
# To share the Plex libraries:
# python share_unshare_libraries.py share
# To unshare the Plex libraries:
# python share_unshare_libraries.py unshare
import requests
import sys
from xml.dom import minidom
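
A hedged sketch of the command-line dispatch the usage notes above describe; share_libraries() and unshare_libraries() are hypothetical stand-ins for the sharing logic the full gist implements against the Plex APIs.

import sys

def share_libraries():
    # Placeholder for the gist's real sharing calls.
    print("sharing libraries...")

def unshare_libraries():
    # Placeholder for the gist's real unsharing calls.
    print("unsharing libraries...")

if __name__ == "__main__":
    if len(sys.argv) != 2 or sys.argv[1] not in ("share", "unshare"):
        sys.exit("Usage: python share_unshare_libraries.py [share|unshare]")
    if sys.argv[1] == "share":
        share_libraries()
    else:
        unshare_libraries()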
@chazlarson
chazlarson / plex_cleanup.rb
Created April 20, 2022 17:52 — forked from mblythe86/plex_cleanup.rb
Clean unused posters/art/banners from Plex
#!/usr/bin/env ruby
# At least for movies, it looks like it copies the 'selected' stuff into the
# _stored/ folder. It also seems to use the _combined/ folder for access.
# So, I should be able to delete anything that's not in those two.
# And after that, in the _combined folder, I can/should delete any broken links
# and remove them from <art> and <posters> in Info.xml.
# remove <reviews> from Info.xml too, since it seems to be unused.
# As far as I can tell, extras.xml can just be deleted?
# Same for Artists
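
A short sketch (in Python rather than the gist's Ruby) of one step described above: removing broken links from an item's _combined/ folder. The directory layout is an assumption for illustration only; the real cleanup also edits Info.xml.

import os

def remove_broken_links(combined_dir, dry_run=True):
    removed = []
    for name in os.listdir(combined_dir):
        path = os.path.join(combined_dir, name)
        # A symlink whose target no longer exists is broken: islink() sees the
        # link itself, exists() follows it to the (missing) target.
        if os.path.islink(path) and not os.path.exists(path):
            if not dry_run:
                os.remove(path)
            removed.append(path)
    return removed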
@chazlarson
chazlarson / sb_gd.sh
Last active November 9, 2021 23:33
Script to automate some Google Drive setup. It installs safire, creates 3 projects, creates 3 Shared Drives, creates 100 Service Accounts in each project, downloads all the SA JSON files, adds all those SAs to a Google Group, adds that group to all three Shared Drives, and then creates four rclone remotes [one for each Shared Drive, one union].
#!/bin/bash
# Assumptions:
# 1. You have created a google project as described here: https://docs.saltbox.dev/reference/google-project-setup/
# 2. You have the credential JSON to hand
# 3. You have created a google group as described here: https://docs.saltbox.dev/reference/google-group-setup/
# 4. You have that group address to hand
# 5. You have rclone installed
# 6. You are running python 3.8 and have run sudo apt install python3.8-venv -y
# Other Python 3 versions probably work; the assumption is just that a venv can be created
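
As a hedged illustration of the final step in the description above (creating the rclone remotes), here is a Python sketch of building the union remote on top of three Shared Drive remotes. The remote names are assumptions, and the union option name can vary between rclone versions ("upstreams" in current releases, "remotes" in older ones).

import subprocess

def create_union_remote(union_name="gdrive_union", members=("sd1:", "sd2:", "sd3:")):
    # Equivalent to: rclone config create gdrive_union union upstreams "sd1: sd2: sd3:"
    subprocess.run(
        ["rclone", "config", "create", union_name, "union",
         "upstreams", " ".join(members)],
        check=True,
    )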
@chazlarson
chazlarson / teamdrive.py
Last active July 1, 2020 22:53
You need to create a bunch of Team Drives, share them all with the same set of email addresses, and create cloudbox folder structures on them. Maybe this can help. Mostly cribbed from here: https://wescpy.blogspot.com/2017/06/managing-team-drives-with-python-and.html
from __future__ import print_function
import uuid
from apiclient import discovery
from httplib2 import Http
from oauth2client import file, client, tools
# ##############################################################
# You need to install the Google API stuff
# There's a link on the page where I cribbed this:
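
A hedged sketch of the two Drive API calls the description implies: creating a Team Drive and sharing it with an email address. It assumes DRIVE is an authorized Drive v3 service object like the one the full gist builds with the imports above (uuid is imported there as well); the role and fields choices are illustrative.

def create_and_share_teamdrive(DRIVE, name, email):
    # Team Drive creation needs a unique requestId so retries are idempotent.
    td = DRIVE.teamdrives().create(body={'name': name},
                                   requestId=str(uuid.uuid4()),
                                   fields='id').execute()
    # Grant the address organizer access; supportsTeamDrives is required for
    # permission calls that target a Team Drive.
    perm = {'type': 'user', 'role': 'organizer', 'emailAddress': email}
    DRIVE.permissions().create(fileId=td['id'], body=perm,
                               supportsTeamDrives=True).execute()
    return td['id']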
@chazlarson
chazlarson / whatsnew
Created August 22, 2019 21:19
Save to a directory on your $PATH and make it executable.
#!/bin/sh
# homebrew
echo "Checking homebrew packages..."
brew update > /dev/null;
new_packages=$(brew outdated --quiet)
num_packages=$(echo $new_packages | wc -w)
if [ $num_packages -gt 0 ]; then
echo "New package updates available:"
@chazlarson
chazlarson / app.domain.tld_location
Created June 30, 2019 17:11 — forked from fma965/app.domain.tld_location
Organizr auth working on Cloudbox
# FILE - /opt/nginx-proxy/vhost.d/app.domain.tld_location, e.g. sonarr.cloudbox.com_location
auth_request /auth-2;
# optional failsafe basic auth
satisfy any;
auth_basic "Failsafe Authentication";
auth_basic_user_file /path/to/htpasswd;
## Full group list
@chazlarson
chazlarson / osx_bootstrap.sh
Created June 10, 2019 20:56 — forked from codeinthehole/osx_bootstrap.sh
Script to install stuff I want on a new OSX machine
#!/usr/bin/env bash
#
# Bootstrap script for setting up a new OSX machine
#
# This should be idempotent so it can be run multiple times.
#
# Some apps don't have a cask and so still need to be installed by hand. These
# include:
#
# - Twitter (app store)
@chazlarson
chazlarson / sa-batch-syncer.sh
Created May 28, 2019 17:18
Sync two rclone remotes, cycling through a set of service accounts at 500GB each. Assumptions here are: you have 100 service accounts. Their JSON credential files are named `sa-1.json` through `sa-100.json` and are stored in `/opt/sa-json`.
#!/bin/bash
COUNTER=1
SOURCE="remote:path"
DESTINATION="remote:path"
JSON_LOC="/opt/sa-json"
while [ $COUNTER -le 100 ]; do
echo Using service account sa-$COUNTER
/usr/bin/rclone sync -v --delete-excluded \
--fast-list --checkers=32 --transfers=16 --max-transfer 500G \
--stats 5s --drive-service-account-file=$JSON_LOC/sa-$COUNTER.json \
@chazlarson
chazlarson / sa-batch-uploader.sh
Last active December 27, 2019 22:34
Upload a lot of stuff to a google drive [presumably a teamdrive], cycling through a set of service accounts at 500GB each. Assumptions here are: you have 100 service accounts. Their JSON credential files are named `sa-1.json` through `sa-100.json` and are stored in `/opt/sa-json`. The files you want to upload are in `/files/to/upload`
#!/bin/bash
COUNTER=1
SOURCE="/files/to/upload"
DESTINATION="remote:path"
JSON_LOC="/opt/sa-json"
while [ $COUNTER -le 100 ]; do
echo Using service account sa-$COUNTER
/usr/bin/rclone copy -v --delete-excluded \
--fast-list --checkers=32 --transfers=16 --max-transfer 500G \
--stats 5s --drive-service-account-file=$JSON_LOC/sa-$COUNTER.json \