I was able to find a VERY QUICK AND DIRTY way to use the media-autobuild suite to compile my own 64-bit static FFmpeg for Windows with the NDI library.
Download it and extract it somewhere on your computer, and note the path. I put it in "D:\ndi\media-autobuild_suite-master"; for the rest of these instructions, substitute your own path wherever you see "<autobuild>".
During the initial setup process, choose the static build and add whatever else you'd like to have in your ffmpeg. When the on-screen prompts tell you the ffmpeg_options file has been written, pause there, open <autobuild>\build\ffmpeg_options.txt, and add a line with
Code:
--enable-libndi_newtek
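Once the suite finishes building, a quick way to confirm the flag actually made it into the binary is to check its build configuration. Here is a minimal sketch in Python; the output path below is an assumption about where media-autobuild_suite leaves the static build, so point it at your own <autobuild> folder.
Code:
import subprocess

# Path is an assumption -- media-autobuild_suite typically places the finished
# static build under <autobuild>\local64\bin-video; adjust to your own setup.
ffmpeg_exe = r"D:\ndi\media-autobuild_suite-master\local64\bin-video\ffmpeg.exe"

# -buildconf prints the configure flags the binary was compiled with.
conf = subprocess.run([ffmpeg_exe, "-hide_banner", "-buildconf"],
                      capture_output=True, text=True).stdout
print("NDI enabled" if "--enable-libndi_newtek" in conf else "NDI flag missing")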
akarak / centering.py
Last active December 3, 2019 16:36 — forked from saleph/centering.py
[qt5] center a window on screen
from PyQt5.QtWidgets import QWidget, QDesktopWidget, QApplication


class Example(QWidget):
    def __init__(self):
        super().__init__()
        self.init_ui()

    def init_ui(self):
        self.resize(250, 150)
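        # (gist truncated here) -- a sketch of how the centering step presumably
        # continues, using the usual QDesktopWidget approach; the center() name
        # is an assumption rather than the original gist's code.
        self.center()
        self.show()

    def center(self):
        # Move the window's frame so its center lands on the screen's center.
        frame = self.frameGeometry()
        frame.moveCenter(QDesktopWidget().availableGeometry().center())
        self.move(frame.topLeft())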

What do you render your After Effects videos to? Animated GIFs/PNGs? Creating a dynamic follow block.
-@LILTALK

It depends on how long, large, and complex the animation is, as well as what I plan to do with it.

If the animation can be broken down into parts and replicated in HTML/SVG/Canvas, then I do that. I go layer by layer and figure out how to replicate it entirely in code.

If the animation can't be easily replicated in code, or if it just isn't time-efficient to do so, I then decide if it would be easier to work with as an EaselJS sprite or a WebM video.

EaselJS sprites are great for small, intricate things that are no more than a couple of seconds long. I export from After Effects as a PNG sequence, then use TexturePacker to create a single sprite sheet and generate the EaselJS sprite sheet data.

akarak / disable-gpu-compositing.md
Created July 6, 2019 19:52
OBS Studio Browser Source --disable-gpu-compositing

To pass the --disable-gpu-compositing CEF flag to OBS Studio's Browser Source plugin, you'll need to add it to your OBS Studio shortcut.

Windows

  1. Locate your shortcut to OBS Studio.
  2. Right-click the shortcut and choose "Properties".
  3. Add the following to the end of the "Target" field, making sure to put a space before it: --disable-gpu-compositing (see the example after this list).
  4. Launch OBS Studio via this shortcut, and try out a Browser Source animation to verify that it does indeed appear smoother.
  5. OBS Studio updates usually delete and re-make this shortcut, so you may need to repeat these steps after every OBS Studio update.
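
For reference, the finished Target value ends up looking something like the line below. The OBS install path shown is just the default location and an assumption; keep whatever path your shortcut already uses and simply append the flag:

"C:\Program Files\obs-studio\bin\64bit\obs64.exe" --disable-gpu-compositing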

FAQ

akarak / casparcg_ndi_instructions.md
Created July 6, 2019 19:50
How to get NDI output to work in CasparCG


  1. Install NewTek Network Video Send.exe
  2. Install NewTek NDI AirSend Updater.exe
  3. The AirSend Updater has a bug, and will not properly copy the x64 version of the DLL to the correct place. So, you'll need to manually copy C:\Program Files\NewTek\NewTek NDI AirSend Updater\Processing.AirSend.x64.dll to C:\Windows\System32\, overwriting the existing file.
  4. Restart your PC.
  5. Configure the CasparCG Server to use a Newtek iVGA output. Even though it says iVGA, it will actually be outputting NDI thanks to the updated AirSend DLLs.
  • You can do this manually by adding a `<newtek-ivga />` consumer to your casparcg.config, or you can do it via the third-party [Caspar
akarak / pyav_example.py
Created June 15, 2019 21:20 — forked from w495/pyav_example.py
An example of working with video in Python. The example extracts frames from a video and converts them into numpy arrays of size `32 x 32`, then assembles a new video from the resulting arrays. Based on http://ru.stackoverflow.com/questions/519636/#551344
import av
from av.video.frame import VideoFrame
from av.video.stream import VideoStream
# This list will hold the frames as numpy arrays.
array_list = []
# Open the container for reading
input_container = av.open('input.mp4')
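# (gist truncated here) -- a guessed continuation of the loop the description
# refers to: decode each frame, shrink it to 32x32, and keep the numpy array.
# frame.to_ndarray() assumes a reasonably recent PyAV.
for frame in input_container.decode(video=0):
    small = frame.reformat(width=32, height=32, format='rgb24')
    array_list.append(small.to_ndarray())
input_container.close()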
akarak / pyav_rtmp_example.py
Created June 15, 2019 21:15 — forked from w495/pyav_rtmp_example.py
An example of working with video in Python: there is an RTMP stream and an RTMP server that the stream needs to be forwarded to. Based on https://ru.stackoverflow.com/a/664973/203032
# -*- coding: utf8 -*-
import av
# Open the source stream for reading
input_resource = av.open(
    'rtmp://src_stream:1935/play'
)
# Open the destination for writing.
output_resource = av.open(
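    # (gist truncated here) -- a guessed continuation: the destination URL is a
    # placeholder, and mode='w' / format='flv' are the usual choices for an RTMP
    # publish rather than values taken from the original gist.
    'rtmp://dst_stream:1935/publish',
    mode='w',
    format='flv',
)

# Mirror the input video stream on the output and copy packets across without
# re-encoding (add_stream(template=...) assumes a reasonably recent PyAV).
in_stream = input_resource.streams.video[0]
out_stream = output_resource.add_stream(template=in_stream)

for packet in input_resource.demux(in_stream):
    if packet.dts is None:
        continue  # skip flush packets that carry no timestamp
    packet.stream = out_stream
    output_resource.mux(packet)

output_resource.close()
input_resource.close()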
akarak / pycon-views.py
Created June 13, 2019 20:59 — forked from miguelgrinberg/pycon-views.py
Generate statistics about PyCon 2014 videos
import argparse
import re
from multiprocessing.pool import ThreadPool as Pool
import requests
import bs4
root_url = 'http://pyvideo.org'
index_url = root_url + '/category/50/pycon-us-2014'
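# (gist truncated here) -- a guessed sketch of how the imports above are
# typically combined; the CSS selectors and page structure are assumptions
# about pyvideo.org's 2014 markup, not the original script's logic.
def get_video_page_urls():
    # Collect links to the individual video pages from the index page.
    index = bs4.BeautifulSoup(requests.get(index_url).text, 'html.parser')
    return [root_url + a['href'] for a in index.select('a[href*="/video/"]')]

def get_video_data(url):
    # Fetch one video page and return its title text.
    page = bs4.BeautifulSoup(requests.get(url).text, 'html.parser')
    return page.find('h3').get_text(strip=True)

if __name__ == '__main__':
    pool = Pool(8)  # fetch pages concurrently on 8 worker threads
    for title in pool.map(get_video_data, get_video_page_urls()):
        print(title)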
akarak / Get-VideoHubRouting.ps1
Created June 4, 2019 10:34 — forked from imorrish/Get-VideoHubRouting.ps1
Create Excel Matrix showing Blackmagic VideoHub routing
#Get video hub properties and add to Excel file
# Author: Ian Morrish
# Website: https://ianmorrish.wordpress.com
# tested on Windows 10 running in ISE with Excel 2016
# update the IP address and the path to the included send-tcprequest.ps1 script as required
$VideoHupIP = "127.0.0.1"
$hubcommand = @"
VIDEO OUTPUT ROUTING:
`r`n
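
The VIDEO OUTPUT ROUTING: block that Get-VideoHubRouting.ps1 builds above is part of Blackmagic's plain-text Videohub Ethernet protocol. As a point of reference, the same exchange can be sketched in a few lines of Python; TCP port 9990 and the blank-line block terminator come from the published protocol rather than from this script, so treat it as an illustration, not a drop-in replacement for send-tcprequest.ps1:

import socket

VIDEOHUB_IP = "127.0.0.1"  # same placeholder address as $VideoHupIP above

# The Videohub speaks a line-based text protocol on TCP port 9990 and dumps its
# full state (including the VIDEO OUTPUT ROUTING: block) when a client connects.
with socket.create_connection((VIDEOHUB_IP, 9990), timeout=5) as sock:
    sock.settimeout(2)
    data = b""
    try:
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            data += chunk
    except socket.timeout:
        pass  # nothing more arriving; the initial status dump is complete enough

print(data.decode("ascii", errors="replace"))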
# Script created by Ian Morrish
# See https://ianmorrish.wordpress.com/2019/03/02/atem-wireless-multiviewer-with-touch-switching/
# Run NewTek Studio Monitor first, select an input and set it to full screen.
# This script requires my switcherlib.dll, which you can get from the site above.
# After starting this script, <ALT><Tab> to see the multiview live video feed (I will automate this soon)
#Show-Process "overlay"
# Connect to ATEM
function ConnectToATEM()