  • The biggest change in Apollo 1.5 is also the most requested one: better private message support! You can now view and reply to Reddit messages in a beautiful, fully threaded conversational view, and little things like seeing your sent messages are so much nicer now. This includes separate inbox sections, such as viewing only comment replies.
  • Added an awesome new Pro feature: auto-collapsing child comments. When reading threads you can collapse everything that isn't a top-level comment, making it even easier to get to the juiciest replies. You can have it happen automatically every time, have Apollo remember your setting per subreddit, or toggle it manually.
  • ANOTHER awesome new Pro feature (well, two): auto-hiding read posts (keeps that feed fresh) and auto-marking posts as read on scroll. There's also an option for a permanent floating "hide read posts" button, plus a fix for a bug around hiding.
  • AMAZING moderator features. I think Apollo is now one of the most full-featured apps out there for Reddit moderators. First, M
import UIKit
import AVFoundation
import Photos
import MobileCoreServices

class ViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startVideoToGIFProcess()
    }

// Frame extraction, approach 1: AVAssetImageGenerator
// `url` is a locally saved MP4
let asset = AVURLAsset(url: url)
let generator = AVAssetImageGenerator(asset: asset)

// Allow a small (0.05 s) tolerance on either side of each requested frame time
generator.requestedTimeToleranceBefore = CMTime(seconds: 0.05, preferredTimescale: 600)
generator.requestedTimeToleranceAfter = CMTime(seconds: 0.05, preferredTimescale: 600)

// Cap the generated frames at 10% of the source dimensions (450 x 563)
let sizeModifier: CGFloat = 0.1
generator.maximumSize = CGSize(width: 450.0 * sizeModifier, height: 563.0 * sizeModifier)

// A second generator with only the "before" tolerance set, for comparison
let asset2 = AVURLAsset(url: url)
let generator2 = AVAssetImageGenerator(asset: asset2)
generator2.requestedTimeToleranceBefore = CMTime(seconds: 0.05, preferredTimescale: 600)

// Time the extraction; the test video has 882 frames
let startTime = CFAbsoluteTimeGetCurrent()
let totalFrames = 882
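These excerpts configure the image generators but never actually request frames from them. Here is a minimal sketch of how frames might be pulled out, assuming evenly spaced times derived from `totalFrames` and the asset duration (the spacing and the choice of the asynchronous API are assumptions, not necessarily what the original code did):

// Sketch only: request one CGImage per frame time from the configured generator.
let duration = CMTimeGetSeconds(asset.duration)
let frameTimes: [NSValue] = (0 ..< totalFrames).map { frame in
    let seconds = duration * Double(frame) / Double(totalFrames)
    return NSValue(time: CMTime(seconds: seconds, preferredTimescale: 600))
}

generator.generateCGImagesAsynchronously(forTimes: frameTimes) { _, cgImage, actualTime, result, error in
    guard let cgImage = cgImage, result == .succeeded else { return }
    // Each extracted frame would be appended to the output GIF here
    print("Got frame at \(CMTimeGetSeconds(actualTime))s: \(cgImage.width)x\(cgImage.height)")
}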
// Frame extraction, approach 2: AVAssetReader
// `url` is a locally saved MP4
let asset = AVURLAsset(url: url)
let reader = try! AVAssetReader(asset: asset)
let videoTrack = asset.tracks(withMediaType: .video).first!
// Read frames as 32-bit BGRA so they can be wrapped in a CGContext below (pixel format is assumed)
let outputSettings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
reader.add(readerOutput)
reader.startReading()
func cgImageFromSampleBuffer(_ buffer: CMSampleBuffer) -> CGImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) else {
        return nil
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage
    let context = CGContext(data: baseAddress, width: CVPixelBufferGetWidth(pixelBuffer), height: CVPixelBufferGetHeight(pixelBuffer), bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
    return context?.makeImage()
}
// Serial queue so frames are appended to the GIF strictly in order
let operationQueue = OperationQueue()
operationQueue.maxConcurrentOperationCount = 1

// `destination` is a CGImageDestination for the output GIF and `frameProperties`
// holds the per-frame delay; neither is shown in these excerpts (see the sketch below)
var sample = readerOutput.copyNextSampleBuffer()
while sample != nil {
    if let newSample = sample {
        let cgImage = self.cgImageFromSampleBuffer(newSample)!
        operationQueue.addOperation { // Problematic son of a
            CGImageDestinationAddImage(destination, cgImage, frameProperties as CFDictionary)
        }
    }
    sample = readerOutput.copyNextSampleBuffer()
}
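The loop above leans on a `destination` and `frameProperties` that never appear in these excerpts. Here is a minimal sketch of how they might be set up and finalized, assuming a GIF written to a temporary file with a 1/30 s per-frame delay (the `gifURL` name, the delay value, and the finalize step are assumptions, not the original code):

// Sketch only: the CGImageDestination and frame properties assumed by the loop above.
// (CGImageDestination and the kCGImageProperty* constants come from ImageIO.)
let gifURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("output.gif")
let fileProperties = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFLoopCount as String: 0]]
let frameProperties = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFDelayTime as String: 1.0 / 30.0]]

let destination = CGImageDestinationCreateWithURL(gifURL as CFURL, kUTTypeGIF, totalFrames, nil)!
CGImageDestinationSetProperties(destination, fileProperties as CFDictionary)

// Once every frame has been queued, wait for the serial queue to drain and finish the file
operationQueue.waitUntilAllOperationsAreFinished()
CGImageDestinationFinalize(destination)
print("Finished in \(CFAbsoluteTimeGetCurrent() - startTime) seconds")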
#import "ObjCRootViewController.h"
@import AVFoundation;
@import MobileCoreServices;
@import Photos;
@interface ObjCRootViewController ()
@end
@implementation ObjCRootViewController
// Download the sample MP4 and write it to a uniquely named temporary file
override func viewDidLoad() {
    super.viewDidLoad()

    let data = try! Data(contentsOf: URL(string: "https://i.imgur.com/dXxP7a9.mp4")!)
    let fileName = String(format: "%@_%@", ProcessInfo.processInfo.globallyUniqueString, "html5gif.mov")
    let fileURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(fileName)
    try! data.write(to: fileURL, options: [.atomic])
}
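Photos is imported in both view controllers but never used in these excerpts; presumably the finished GIF ends up in the user's photo library. Here is a minimal sketch of that step, assuming the `gifURL` from the sketch above points at the finalized file:

// Sketch only: save the finished GIF to the photo library (`gifURL` is an assumed name).
PHPhotoLibrary.requestAuthorization { status in
    guard status == .authorized else { return }
    PHPhotoLibrary.shared().performChanges({
        _ = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: gifURL)
    }, completionHandler: { success, error in
        print("Saved GIF to Photos: \(success), error: \(String(describing: error))")
    })
}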