akarak / hlssink4.py
Created January 12, 2025 00:01 — forked from SanchayanMaity/hlssink4.py
Test code for hlssink4
#!/usr/bin/env python3
import os # noqa: I001
import signal
import sys
import subprocess
import gi
gi.require_version("GLib", "2.0")
akarak / sse-fastapi-redis.py
Created April 26, 2024 17:46 — forked from lbatteau/sse-fastapi-redis.py
Server-Sent Events in FastAPI with async Redis Pub/Sub
from aioredis import Channel, Redis
from fastapi import FastAPI
from fastapi.params import Depends
from fastapi_plugins import depends_redis, redis_plugin
from sse_starlette.sse import EventSourceResponse
from starlette.responses import HTMLResponse
html = """
<!DOCTYPE html>
<html>
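The preview above cuts off before the endpoint itself. As a self-contained illustration of the pub/sub fan-out pattern the gist builds on, here is a sketch using asyncio queues in place of Redis channels (`Broker` and `demo` are my names, not from the gist):

```python
import asyncio

class Broker:
    """In-memory stand-in for Redis pub/sub: one queue per subscriber."""

    def __init__(self):
        self._subscribers = set()

    def subscribe(self):
        # each SSE client gets its own queue, like SUBSCRIBE on a channel
        q = asyncio.Queue()
        self._subscribers.add(q)
        return q

    def publish(self, message):
        # copy the message to every subscriber, like PUBLISH on a channel
        for q in self._subscribers:
            q.put_nowait(message)

async def demo():
    broker = Broker()
    sub = broker.subscribe()
    broker.publish("event: progress 42%")
    return await sub.get()
```

In the real gist, the SSE endpoint would iterate over the subscriber's stream and yield each message to `EventSourceResponse`; the queue here plays the role of the Redis channel.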
akarak / m3u8-to-mp4.md
Created January 26, 2023 22:15 — forked from tzmartin/m3u8-to-mp4.md
m3u8 stream to mp4 using ffmpeg

1. Copy m3u8 link


2. Run command

echo "Enter m3u8 link:"; read link; echo "Enter output filename:"; read filename; ffmpeg -i "$link" -bsf:a aac_adtstoasc -c copy "$filename.mp4"
akarak / README.md
Created November 12, 2021 22:45 — forked from boneskull/README.md
MicroPython on ESP32: MQTT and DS18B20 temperature sensor full example
akarak / OBS-SRT-DEBIAN.md
Created May 5, 2021 20:36 — forked from Kusmeroglu/OBS-SRT-DEBIAN.md
Notes on Installing SRT with OBS on Ubuntu / Debian / Linux


Feedback welcomed - if this becomes old or contains misleading information please let me know.

OBS provides some instructions on how to get SRT, but I found that they didn't quite work for me on a fresh minimal install of Ubuntu 20.04 (Focal). OBS on Linux doesn't appear to come with SRT support out of the box.

The OBS Discord community was really helpful, but in the end it took a lot of futzing. The following are my notes, in case they're helpful to anyone else.

By the way, if you are considering SRT with OBS, it was extremely painful to install correctly, but then it worked beautifully. The error correction SRT provides was very nice when my wireless signal was poor.
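For what it's worth, once you have an SRT-capable build, pointing OBS at an SRT destination is just a URL in Settings → Stream → Custom server. The host, port, and values below are illustrative only; the query parameters follow FFmpeg's SRT protocol options (note that `latency` is in microseconds there, so 2000000 is two seconds):

```
srt://203.0.113.5:9999?mode=caller&latency=2000000
```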

#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# GStreamer SDK Tutorials in Python
#
# basic-tutorial-2
#
"""
basic-tutorial-2: GStreamer concepts
http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+2%3A+GStreamer+concepts
"""
akarak / kivy_decklink.py
Created April 4, 2020 16:43 — forked from cbenhagen/kivy_decklink.py
Example of using input from a Blackmagic Decklink or Ultrastudio card in kivy with the use of OpenCV and GStreamer.
I was able to find a VERY QUICK AND DIRTY way to use the media-autobuild suite to compile my own 64-bit static FFmpeg for Windows with the NDI library.
Download it and extract it somewhere on your computer, and keep note of the path. I put it in "D:\ndi\media-autobuild_suite-master", so for the sake of these instructions, when you see "<autobuild>" substitute whatever path you've put it in.
During the initial setup process, request the static build and add whatever else you'd like to have in your FFmpeg. Pause when the on-screen prompts tell you the ffmpeg_options file has been written, then go into <autobuild>\build\ffmpeg_options.txt and add somewhere a line with
Code:
--enable-libndi_newtek
akarak / centering.py
Last active December 3, 2019 16:36 — forked from saleph/centering.py
[qt5] center a window on screen
from PyQt5.QtWidgets import QWidget, QDesktopWidget, QApplication

class Example(QWidget):
    def __init__(self):
        super().__init__()
        self.init_ui()

    def init_ui(self):
        self.resize(250, 150)
        # center the window: move the frame's center to the screen's center
        frame = self.frameGeometry()
        frame.moveCenter(QDesktopWidget().availableGeometry().center())
        self.move(frame.topLeft())
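Independent of Qt, the centering arithmetic reduces to offsetting the window's top-left corner by half the leftover space in each dimension. A tiny sketch (the function name is mine, not from the gist):

```python
def centered_top_left(screen_w, screen_h, win_w, win_h):
    """Top-left coordinates that center a win_w x win_h window
    on a screen_w x screen_h screen."""
    return (screen_w - win_w) // 2, (screen_h - win_h) // 2

# e.g. a 250x150 window on a 1920x1080 screen
print(centered_top_left(1920, 1080, 250, 150))  # → (835, 465)
```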

What do you render your After Effects videos to? Animated GIFs/PNGs? Creating a dynamic follow block.
-@LILTALK

It depends on how long, large, and complex the animation is, as well as what I plan to do with it.

If the animation can be broken down into parts and replicated in HTML/SVG/Canvas, then I do that. I go layer by layer and figure out how to replicate it entirely in code.

If the animation can't be easily replicated in code, or if it just isn't time-efficient to do so, I then decide if it would be easier to work with as an EaselJS sprite or a WebM video.

EaselJS sprites are great for small, intricate things that are no more than a couple seconds long. I export from After Effects as a PNG sequence, then use TexturePacker to create a single sprite sheet and generate the EaselJS sprite sheet data.