// The Y plane holds the luminance component; the UV plane holds the interleaved Cb and Cr chroma components.
// With kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, the luma plane is 8bpp at the full video dimensions, while the chroma plane is 16bpp at a quarter of the original pixel count (half the width, half the height). Each pixel of the chroma plane carries one Cb and one Cr component.
// So if your input video is 352x288, your Y plane is 352x288 at 8bpp and your CbCr plane is 176x144 at 16bpp. That works out to about the same amount of data as a 12bpp 352x288 image: half of what RGB888 would require, and still less than RGB565.
// So in the buffer, Y looks like [YYYYY . . .] and UV like [UVUVUVUVUV . . .],
// vs. RGB being, of course, [RGBRGBRGB . . .].
// https://stackoverflow.com/questions/13429456/how-seperate-y-planar-u-planar-and-uv-planar-from-yuv-bi-planar-in-ios
#pragma mark - ARSessionDelegate
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    // The camera image arrives as a bi-planar YCbCr CVPixelBuffer.
    CVPixelBufferRef pixelBuffer = frame.capturedImage;
    // ... read the Y and CbCr planes from pixelBuffer here ...
}
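
To make the plane layout concrete, here is a minimal Swift sketch of reading both planes of an ARFrame's capturedImage. It assumes the buffer really is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange; the function name samplePlanes(of:) and the sampled coordinate are illustrative, not part of the gist.

import ARKit

// Minimal sketch: reading the Y and CbCr planes of a 420f capturedImage.
func samplePlanes(of frame: ARFrame) {
    let buffer = frame.capturedImage
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    // Plane 0: luma, 8 bits per pixel, full resolution.
    let yBase = CVPixelBufferGetBaseAddressOfPlane(buffer, 0)!
    let yWidth = CVPixelBufferGetWidthOfPlane(buffer, 0)
    let yHeight = CVPixelBufferGetHeightOfPlane(buffer, 0)
    let yStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)

    // Plane 1: interleaved CbCr, 2 bytes per chroma pixel, half resolution.
    let cbcrBase = CVPixelBufferGetBaseAddressOfPlane(buffer, 1)!
    let cbcrStride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)

    // Luma at pixel (x, y):
    let x = 0, y = 0
    let luma = yBase.assumingMemoryBound(to: UInt8.self)[y * yStride + x]

    // Chroma for the same pixel lives at (x / 2, y / 2) in the CbCr plane;
    // each entry there is a [Cb, Cr] byte pair.
    let cbcr = cbcrBase.assumingMemoryBound(to: UInt8.self)
    let cb = cbcr[(y / 2) * cbcrStride + (x / 2) * 2]
    let cr = cbcr[(y / 2) * cbcrStride + (x / 2) * 2 + 1]
    _ = (luma, yWidth, yHeight, cb, cr)
}

Note the use of bytes-per-row rather than width when indexing: CoreVideo may pad each row, so the stride can exceed width times bytes-per-pixel.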
markdaws / CapturedImageSampler.swift
Created November 11, 2019 23:46, forked from JoshuaSullivan/CapturedImageSampler.swift
The source code for an object that helps you sample RGB values from ARFrames.
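
Because the captured image is YCbCr rather than RGB, sampling an RGB value also requires a color-space conversion. A minimal sketch of the usual full-range BT.601 math follows; the function name rgb(fromY:cb:cr:) is illustrative, and whether the gist applies this exact matrix is an assumption.

// Full-range BT.601 YCbCr -> RGB, the common conversion for 420f video.
// Inputs are raw 0...255 plane values.
func rgb(fromY y: UInt8, cb: UInt8, cr: UInt8) -> (r: UInt8, g: UInt8, b: UInt8) {
    let yf = Float(y)
    let cbf = Float(cb) - 128
    let crf = Float(cr) - 128
    let r = yf + 1.402 * crf
    let g = yf - 0.344136 * cbf - 0.714136 * crf
    let b = yf + 1.772 * cbf
    let clamp = { (v: Float) in UInt8(max(0, min(255, v))) }
    return (clamp(r), clamp(g), clamp(b))
}

Clamping before the UInt8 conversion matters because the matrix can push values slightly outside 0...255 for saturated colors.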
//
// CapturedImageSampler.swift
// ARKitTest
//
// Created by Joshua Sullivan on 9/22/17.
// Copyright © 2017 Joshua Sullivan. All rights reserved.
//
import UIKit
import ARKit