wood1986 / yuv_buffer_to_image.swift
Created April 14, 2024 05:24 — forked from lobianco/yuv_buffer_to_image.swift
Converts a frame from a YUV video sample buffer (like what RPScreenRecorder provides) into a UIImage.
```swift
import Accelerate
import CoreGraphics
import CoreMedia
import Foundation
import QuartzCore
import UIKit

func createImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    // Preview truncated here; the full gist converts the buffer's YUV planes
    // into a CGImage and returns a UIImage.
    return nil // placeholder so the truncated preview compiles
}
```
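For comparison, a minimal sketch of the same conversion using Core Image instead of manual YUV plane handling. The function name and `context` parameter are my own; it assumes the buffer's pixel format is one Core Image can ingest, and lets `CIContext` perform the YCbCr-to-RGB conversion internally:

```swift
import CoreImage
import CoreMedia
import UIKit

// Sketch: convert a CMSampleBuffer to a UIImage via Core Image.
// Reusing one CIContext across frames avoids repeated setup cost.
func image(from sampleBuffer: CMSampleBuffer,
           context: CIContext = CIContext()) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    // Render the (possibly YUV-backed) CIImage into an RGB CGImage.
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

This trades the fine-grained control of the Accelerate route for brevity; for per-frame screen-recording workloads the vImage approach in the gist can be cheaper.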
wood1986 / MTLARFrameProcessor.swift
Created April 14, 2024 05:23 — forked from ctreffs/MTLARFrameProcessor.swift
Processes the CVPixelBuffer from ARKit's ARFrame.capturedImage into an sRGB image, as the documentation describes.
```swift
import CoreImage
import CoreVideo
import ImageIO
import Metal

/// Processes the CVPixelBuffer from ARKit's ARFrame.capturedImage into an sRGB image, as the documentation describes.
///
/// ARKit captures pixel buffers in a full-range planar YCbCr format (also known as YUV), per the ITU-R BT.601-4 standard.
/// (You can verify this by checking the kCVImageBufferYCbCrMatrixKey pixel buffer attachment.)
/// Unlike some uses of that standard, ARKit captures full-range color values, not video-range values.
```
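The full-range BT.601 conversion the comment refers to can be sketched per pixel as follows. This is my own illustration of the math, not the gist's Metal pipeline: Y spans [0, 255] (full range, no 16–235 video-range scaling), and Cb/Cr are centered at 128:

```swift
// Full-range BT.601 YCbCr -> RGB for one 8-bit pixel.
//   R = Y + 1.402 (Cr - 128)
//   G = Y - 0.344136 (Cb - 128) - 0.714136 (Cr - 128)
//   B = Y + 1.772 (Cb - 128)
func rgb(y: UInt8, cb: UInt8, cr: UInt8) -> (r: UInt8, g: UInt8, b: UInt8) {
    let yf = Float(y)
    let cbf = Float(cb) - 128
    let crf = Float(cr) - 128
    // Clamp to [0, 255] before narrowing back to UInt8.
    func clamp(_ v: Float) -> UInt8 { UInt8(max(0, min(255, v.rounded()))) }
    return (clamp(yf + 1.402 * crf),
            clamp(yf - 0.344136 * cbf - 0.714136 * crf),
            clamp(yf + 1.772 * cbf))
}
```

A neutral gray (Y = 128, Cb = Cr = 128) maps to (128, 128, 128), which is a quick sanity check that the chroma offsets are handled correctly.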