
@acj
Last active May 5, 2023 12:23
Build a movie from JPEG images in Swift using AVFoundation

This code has moved

Please refer to the TimeLapseBuilder-Swift repository on GitHub from now on.

I will leave the original code here as a reference, but new comments may be removed. Please open an issue on GitHub if you have questions or would like to contribute.

Thanks!

//
// BuildTimelapseViewController.swift
//
// Created by Adam Jensen on 5/9/15.
//

import JGProgressHUD
import JoePro
import UIKit

class BuildTimelapseViewController: UIViewController {
    @IBOutlet weak var resolutionSegmentedControl: UISegmentedControl!
    @IBOutlet weak var speedSlider: UISlider!
    @IBOutlet weak var removeFisheyeSlider: UISwitch!

    var album: String?
    var camera: JoeProCamera?
    var timeLapseBuilder: TimeLapseBuilder?

    init(camera: JoeProCamera, album: String) {
        self.camera = camera
        self.album = album
        super.init(nibName: "BuildTimelapseViewController", bundle: nil)
    }

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    @IBAction func buildButtonTapped(sender: AnyObject) {
        if let camera = camera,
            let album = album {
            let progressHUD = JGProgressHUD(style: .Light)
            progressHUD.textLabel.text = "Building your timelapse..."
            progressHUD.indicatorView = JGProgressHUDRingIndicatorView(HUDStyle: .Light)
            progressHUD.setProgress(0, animated: true)
            progressHUD.showInView(view)

            camera.listOfVideos(album) { (videos) -> Void in
                self.timeLapseBuilder = TimeLapseBuilder(photoURLs: videos)
                self.timeLapseBuilder!.build(
                    { (progress: NSProgress) in
                        NSLog("Progress: \(progress.completedUnitCount) / \(progress.totalUnitCount)")
                        dispatch_async(dispatch_get_main_queue(), {
                            let progressPercentage = Float(progress.completedUnitCount) / Float(progress.totalUnitCount)
                            progressHUD.setProgress(progressPercentage, animated: true)
                        })
                    },
                    success: { url in
                        NSLog("Output written to \(url)")
                        dispatch_async(dispatch_get_main_queue(), {
                            progressHUD.dismiss()
                        })
                    },
                    failure: { error in
                        NSLog("failure: \(error)")
                        dispatch_async(dispatch_get_main_queue(), {
                            progressHUD.dismiss()
                        })
                    }
                )
            }
        }
    }
}

//
// TimeLapseBuilder.swift
//
// Created by Adam Jensen on 5/10/15.
//
// NOTE: This is the original Swift 1.2 implementation. For an updated version
// written in Swift 2.0, see https://gist.github.com/acj/6ae90aa1ebb8cad6b47b

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

class TimeLapseBuilder: NSObject {
    let photoURLs: [String]
    var videoWriter: AVAssetWriter?

    init(photoURLs: [String]) {
        self.photoURLs = photoURLs
    }

    func build(progress: (NSProgress -> Void), success: (NSURL -> Void), failure: (NSError -> Void)) {
        let inputSize = CGSize(width: 4000, height: 3000)
        let outputSize = CGSize(width: 1280, height: 720)
        var error: NSError?

        let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as! NSString
        let videoOutputURL = NSURL(fileURLWithPath: documentsPath.stringByAppendingPathComponent("AssembledVideo.mov"))!

        NSFileManager.defaultManager().removeItemAtURL(videoOutputURL, error: nil)

        videoWriter = AVAssetWriter(URL: videoOutputURL, fileType: AVFileTypeQuickTimeMovie, error: &error)

        if let videoWriter = videoWriter {
            let videoSettings: [NSObject : AnyObject] = [
                AVVideoCodecKey  : AVVideoCodecH264,
                AVVideoWidthKey  : outputSize.width,
                AVVideoHeightKey : outputSize.height,
                // AVVideoCompressionPropertiesKey : [
                //     AVVideoAverageBitRateKey : NSInteger(1000000),
                //     AVVideoMaxKeyFrameIntervalKey : NSInteger(16),
                //     AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
                // ]
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)

            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: videoWriterInput,
                sourcePixelBufferAttributes: [
                    kCVPixelBufferPixelFormatTypeKey : kCVPixelFormatType_32ARGB,
                    kCVPixelBufferWidthKey : inputSize.width,
                    kCVPixelBufferHeightKey : inputSize.height,
                ]
            )

            assert(videoWriter.canAddInput(videoWriterInput))
            videoWriter.addInput(videoWriterInput)

            if videoWriter.startWriting() {
                videoWriter.startSessionAtSourceTime(kCMTimeZero)
                assert(pixelBufferAdaptor.pixelBufferPool != nil)

                let media_queue = dispatch_queue_create("mediaInputQueue", nil)

                videoWriterInput.requestMediaDataWhenReadyOnQueue(media_queue, usingBlock: { () -> Void in
                    let fps: Int32 = 30
                    let frameDuration = CMTimeMake(1, fps)
                    let currentProgress = NSProgress(totalUnitCount: Int64(self.photoURLs.count))

                    var frameCount: Int64 = 0
                    var remainingPhotoURLs = [String](self.photoURLs)

                    while (videoWriterInput.readyForMoreMediaData && !remainingPhotoURLs.isEmpty) {
                        let nextPhotoURL = remainingPhotoURLs.removeAtIndex(0)
                        let lastFrameTime = CMTimeMake(frameCount, fps)
                        let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

                        if !self.appendPixelBufferForImageAtURL(nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
                            error = NSError(
                                domain: kErrorDomain,
                                code: kFailedToAppendPixelBufferError,
                                userInfo: [
                                    "description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer",
                                    "rawError": videoWriter.error ?? "(none)"
                                ]
                            )

                            break
                        }

                        frameCount++

                        currentProgress.completedUnitCount = frameCount
                        progress(currentProgress)
                    }

                    videoWriterInput.markAsFinished()
                    videoWriter.finishWritingWithCompletionHandler { () -> Void in
                        if error == nil {
                            success(videoOutputURL)
                        }
                    }
                })
            } else {
                error = NSError(
                    domain: kErrorDomain,
                    code: kFailedToStartAssetWriterError,
                    userInfo: ["description": "AVAssetWriter failed to start writing"]
                )
            }
        }

        if let error = error {
            failure(error)
        }
    }

    func appendPixelBufferForImageAtURL(url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
        var appendSucceeded = true

        autoreleasepool {
            if let url = NSURL(string: url),
                let imageData = NSData(contentsOfURL: url),
                let image = UIImage(data: imageData) {
                var pixelBuffer: Unmanaged<CVPixelBuffer>?
                let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                    kCFAllocatorDefault,
                    pixelBufferAdaptor.pixelBufferPool,
                    &pixelBuffer
                )

                if let pixelBuffer = pixelBuffer where status == 0 {
                    let managedPixelBuffer = pixelBuffer.takeRetainedValue()

                    fillPixelBufferFromImage(image, pixelBuffer: managedPixelBuffer)

                    appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(
                        managedPixelBuffer,
                        withPresentationTime: presentationTime
                    )
                } else {
                    NSLog("error: Failed to allocate pixel buffer from pool")
                }
            }
        }

        return appendSucceeded
    }

    func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
        let imageData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))
        let lockStatus = CVPixelBufferLockBaseAddress(pixelBuffer, 0)

        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

        let context = CGBitmapContextCreate(
            pixelData,
            Int(image.size.width),
            Int(image.size.height),
            8,
            Int(4 * image.size.width),
            rgbColorSpace,
            bitmapInfo
        )

        CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
    }
}

//
// TimeLapseBuilder.swift
// Vapor
//
// Created by Adam Jensen on 5/10/15.
//
// NOTE: This implementation is written in Swift 2.0.

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

class TimeLapseBuilder: NSObject {
    let photoURLs: [String]
    var videoWriter: AVAssetWriter?

    init(photoURLs: [String]) {
        self.photoURLs = photoURLs
    }

    func build(progress: (NSProgress -> Void), success: (NSURL -> Void), failure: (NSError -> Void)) {
        let inputSize = CGSize(width: 4000, height: 3000)
        let outputSize = CGSize(width: 1280, height: 720)
        var error: NSError?

        let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
        let videoOutputURL = NSURL(fileURLWithPath: documentsPath.stringByAppendingPathComponent("AssembledVideo.mov"))

        do {
            try NSFileManager.defaultManager().removeItemAtURL(videoOutputURL)
        } catch {}

        do {
            try videoWriter = AVAssetWriter(URL: videoOutputURL, fileType: AVFileTypeQuickTimeMovie)
        } catch let writerError as NSError {
            error = writerError
            videoWriter = nil
        }

        if let videoWriter = videoWriter {
            let videoSettings: [String : AnyObject] = [
                AVVideoCodecKey  : AVVideoCodecH264,
                AVVideoWidthKey  : outputSize.width,
                AVVideoHeightKey : outputSize.height,
                // AVVideoCompressionPropertiesKey : [
                //     AVVideoAverageBitRateKey : NSInteger(1000000),
                //     AVVideoMaxKeyFrameIntervalKey : NSInteger(16),
                //     AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
                // ]
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)

            let sourceBufferAttributes = [String : AnyObject](dictionaryLiteral:
                (kCVPixelBufferPixelFormatTypeKey as String, Int(kCVPixelFormatType_32ARGB)),
                (kCVPixelBufferWidthKey as String, Float(inputSize.width)),
                (kCVPixelBufferHeightKey as String, Float(inputSize.height))
            )

            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: videoWriterInput,
                sourcePixelBufferAttributes: sourceBufferAttributes
            )

            assert(videoWriter.canAddInput(videoWriterInput))
            videoWriter.addInput(videoWriterInput)

            if videoWriter.startWriting() {
                videoWriter.startSessionAtSourceTime(kCMTimeZero)
                assert(pixelBufferAdaptor.pixelBufferPool != nil)

                let media_queue = dispatch_queue_create("mediaInputQueue", nil)

                videoWriterInput.requestMediaDataWhenReadyOnQueue(media_queue, usingBlock: { () -> Void in
                    let fps: Int32 = 30
                    let frameDuration = CMTimeMake(1, fps)
                    let currentProgress = NSProgress(totalUnitCount: Int64(self.photoURLs.count))

                    var frameCount: Int64 = 0
                    var remainingPhotoURLs = [String](self.photoURLs)

                    while (videoWriterInput.readyForMoreMediaData && !remainingPhotoURLs.isEmpty) {
                        let nextPhotoURL = remainingPhotoURLs.removeAtIndex(0)
                        let lastFrameTime = CMTimeMake(frameCount, fps)
                        let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

                        if !self.appendPixelBufferForImageAtURL(nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
                            error = NSError(
                                domain: kErrorDomain,
                                code: kFailedToAppendPixelBufferError,
                                userInfo: [
                                    "description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer",
                                    "rawError": videoWriter.error ?? "(none)"
                                ]
                            )

                            break
                        }

                        frameCount++

                        currentProgress.completedUnitCount = frameCount
                        progress(currentProgress)
                    }

                    videoWriterInput.markAsFinished()
                    videoWriter.finishWritingWithCompletionHandler { () -> Void in
                        if error == nil {
                            success(videoOutputURL)
                        }

                        self.videoWriter = nil
                    }
                })
            } else {
                error = NSError(
                    domain: kErrorDomain,
                    code: kFailedToStartAssetWriterError,
                    userInfo: ["description": "AVAssetWriter failed to start writing"]
                )
            }
        }

        if let error = error {
            failure(error)
        }
    }

    func appendPixelBufferForImageAtURL(url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
        var appendSucceeded = false

        autoreleasepool {
            if let url = NSURL(string: url),
                let imageData = NSData(contentsOfURL: url),
                let image = UIImage(data: imageData),
                let pixelBufferPool = pixelBufferAdaptor.pixelBufferPool {
                let pixelBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.alloc(1)
                let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                    kCFAllocatorDefault,
                    pixelBufferPool,
                    pixelBufferPointer
                )

                if let pixelBuffer = pixelBufferPointer.memory where status == 0 {
                    fillPixelBufferFromImage(image, pixelBuffer: pixelBuffer)

                    appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(
                        pixelBuffer,
                        withPresentationTime: presentationTime
                    )

                    pixelBufferPointer.destroy()
                } else {
                    NSLog("error: Failed to allocate pixel buffer from pool")
                }

                pixelBufferPointer.dealloc(1)
            }
        }

        return appendSucceeded
    }

    func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
        CVPixelBufferLockBaseAddress(pixelBuffer, 0)

        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

        let context = CGBitmapContextCreate(
            pixelData,
            Int(image.size.width),
            Int(image.size.height),
            8,
            CVPixelBufferGetBytesPerRow(pixelBuffer),
            rgbColorSpace,
            CGImageAlphaInfo.PremultipliedFirst.rawValue
        )

        CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
    }
}

//
// TimeLapseBuilder30.swift
//
// Created by Adam Jensen on 11/18/16.
//
// NOTE: This implementation is written in Swift 3.0.

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

class TimeLapseBuilder: NSObject {
    let photoURLs: [String]
    var videoWriter: AVAssetWriter?

    init(photoURLs: [String]) {
        self.photoURLs = photoURLs
    }

    func build(_ progress: @escaping ((Progress) -> Void), success: @escaping ((URL) -> Void), failure: ((NSError) -> Void)) {
        let inputSize = CGSize(width: 4000, height: 3000)
        let outputSize = CGSize(width: 1280, height: 720)
        var error: NSError?

        let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
        let videoOutputURL = URL(fileURLWithPath: documentsPath.appendingPathComponent("AssembledVideo.mov"))

        do {
            try FileManager.default.removeItem(at: videoOutputURL)
        } catch {}

        do {
            try videoWriter = AVAssetWriter(outputURL: videoOutputURL, fileType: AVFileTypeQuickTimeMovie)
        } catch let writerError as NSError {
            error = writerError
            videoWriter = nil
        }

        if let videoWriter = videoWriter {
            let videoSettings: [String : AnyObject] = [
                AVVideoCodecKey  : AVVideoCodecH264 as AnyObject,
                AVVideoWidthKey  : outputSize.width as AnyObject,
                AVVideoHeightKey : outputSize.height as AnyObject,
                // AVVideoCompressionPropertiesKey : [
                //     AVVideoAverageBitRateKey : NSInteger(1000000),
                //     AVVideoMaxKeyFrameIntervalKey : NSInteger(16),
                //     AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
                // ]
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)

            let sourceBufferAttributes = [
                (kCVPixelBufferPixelFormatTypeKey as String): Int(kCVPixelFormatType_32ARGB),
                (kCVPixelBufferWidthKey as String): Float(inputSize.width),
                (kCVPixelBufferHeightKey as String): Float(inputSize.height)] as [String : Any]

            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: videoWriterInput,
                sourcePixelBufferAttributes: sourceBufferAttributes
            )

            assert(videoWriter.canAdd(videoWriterInput))
            videoWriter.add(videoWriterInput)

            if videoWriter.startWriting() {
                videoWriter.startSession(atSourceTime: kCMTimeZero)
                assert(pixelBufferAdaptor.pixelBufferPool != nil)

                let media_queue = DispatchQueue(label: "mediaInputQueue")

                videoWriterInput.requestMediaDataWhenReady(on: media_queue) {
                    let fps: Int32 = 30
                    let frameDuration = CMTimeMake(1, fps)
                    let currentProgress = Progress(totalUnitCount: Int64(self.photoURLs.count))

                    var frameCount: Int64 = 0
                    var remainingPhotoURLs = [String](self.photoURLs)

                    while videoWriterInput.isReadyForMoreMediaData && !remainingPhotoURLs.isEmpty {
                        let nextPhotoURL = remainingPhotoURLs.remove(at: 0)
                        let lastFrameTime = CMTimeMake(frameCount, fps)
                        let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)

                        if !self.appendPixelBufferForImageAtURL(nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
                            error = NSError(
                                domain: kErrorDomain,
                                code: kFailedToAppendPixelBufferError,
                                userInfo: ["description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer"]
                            )

                            break
                        }

                        frameCount += 1

                        currentProgress.completedUnitCount = frameCount
                        progress(currentProgress)
                    }

                    videoWriterInput.markAsFinished()
                    videoWriter.finishWriting {
                        if error == nil {
                            success(videoOutputURL)
                        }

                        self.videoWriter = nil
                    }
                }
            } else {
                error = NSError(
                    domain: kErrorDomain,
                    code: kFailedToStartAssetWriterError,
                    userInfo: ["description": "AVAssetWriter failed to start writing"]
                )
            }
        }

        if let error = error {
            failure(error)
        }
    }

    func appendPixelBufferForImageAtURL(_ url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
        var appendSucceeded = false

        autoreleasepool {
            if let url = URL(string: url),
                let imageData = try? Data(contentsOf: url),
                let image = UIImage(data: imageData),
                let pixelBufferPool = pixelBufferAdaptor.pixelBufferPool {
                let pixelBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.allocate(capacity: 1)
                let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                    kCFAllocatorDefault,
                    pixelBufferPool,
                    pixelBufferPointer
                )

                if let pixelBuffer = pixelBufferPointer.pointee, status == 0 {
                    fillPixelBufferFromImage(image, pixelBuffer: pixelBuffer)

                    appendSucceeded = pixelBufferAdaptor.append(
                        pixelBuffer,
                        withPresentationTime: presentationTime
                    )

                    pixelBufferPointer.deinitialize()
                } else {
                    NSLog("error: Failed to allocate pixel buffer from pool")
                }

                pixelBufferPointer.deallocate(capacity: 1)
            }
        }

        return appendSucceeded
    }

    func fillPixelBufferFromImage(_ image: UIImage, pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

        let context = CGContext(
            data: pixelData,
            width: Int(image.size.width),
            height: Int(image.size.height),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: rgbColorSpace,
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
        )

        context?.draw(image.cgImage!, in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))

        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    }
}
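
A minimal usage sketch for the Swift 3.0 version above (not part of the original gist; the file paths are placeholders for whatever string URLs your app produces):

let photoURLs = [
    "file:///path/to/frame0001.jpg", // placeholder paths
    "file:///path/to/frame0002.jpg"
]

let timeLapseBuilder = TimeLapseBuilder(photoURLs: photoURLs)
timeLapseBuilder.build(
    { progress in
        // Called on the writer's media queue; hop back to the main queue before
        // touching UI, as the view controller example at the top of the gist does.
        print("Progress: \(progress.completedUnitCount) / \(progress.totalUnitCount)")
    },
    success: { url in
        print("Output written to \(url)")
    },
    failure: { error in
        print("Failed: \(error)")
    }
)
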
@plutovman

plutovman commented Aug 15, 2015

Adam,
Your code seems to be about the only Swift example of how to create movies from images. I was able to modify it to simply create a movie from a series of JPEGs. The code runs swimmingly and even writes out a movie file with no errors. However, the file is empty.

I was wondering if you could be so kind as to suggest where I might be going wrong or what I might do to troubleshoot? I've made sure my images are getting loaded, run on both the simulator and an iPad, and changed the path where the movie gets written, all to no avail. Any help would be appreciated.

Thanks in advance.

I load the images like so...

var imageSequenceList = [UIImage]()
var rawImageList = [String]()
for i in 0 ... 534 {
  let imageName = ("test." + formatter.stringFromNumber(i)! + ".jpg")
  println (imageName)
  imageSequenceList.append(UIImage(named: imageName)!)
  rawImageList.append(imageName)
}

and I modified the "buildButtonTapped" IBAction to strip out references to a camera device like so...

@IBAction func buildButtonTapped(sender: AnyObject) {
    let progressHUD = JGProgressHUD(style: .Dark)
    progressHUD.textLabel.text = "Building your timelapse..."
    progressHUD.indicatorView = JGProgressHUDRingIndicatorView(HUDStyle: .Light)
    progressHUD.setProgress(0, animated: true)
    progressHUD.showInView(view)

      self.timeLapseBuilder = TimeLapseBuilder(photoURLs: rawImageList)
      self.timeLapseBuilder!.build(
        { (progress: NSProgress) in
          NSLog("Progress: \(progress.completedUnitCount) / \(progress.totalUnitCount)")
          dispatch_async(dispatch_get_main_queue(), {
            let progressPercentage = Float(progress.completedUnitCount) / Float(progress.totalUnitCount)
            progressHUD.setProgress(progressPercentage, animated: true)
          })
        },
        success: { url in
          NSLog("Output written to \(url)")
          dispatch_async(dispatch_get_main_queue(), {
            progressHUD.dismiss()
          })
        },
        failure: { error in
          NSLog("failure: \(error)")
          dispatch_async(dispatch_get_main_queue(), {
            progressHUD.dismiss()
          })
        }
      )

The other change I've made is I have added a bit of code that copies the movie file to the SavedPhotosAlbum, but since the file has no data, nothing gets copied...

      videoWriter.finishWritingWithCompletionHandler { () -> Void in
        if videoWriter.status == AVAssetWriterStatus.Failed {
          println("VIDEO WRITER ERROR: \(videoWriter.error.description)")
        } else {
          if filemgr.fileExistsAtPath(outputPath as String) {

            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
              if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputPath) {  ///(fileURL) {
                var complete : ALAssetsLibraryWriteVideoCompletionBlock = {reason in println("reason \(reason)")}
                UISaveVideoAtPathToSavedPhotosAlbum(outputPath as String, self, "savingCallBack:didFinishSavingWithError:contextInfo:", nil)
              } else {
                println("the file must be bad!")
              }
            });
          } else {
            println("there is no file")
          }
        }
      }

@plutovman

I've posted the relevant files below for troubleshooting.
https://gist.github.com/plutovman/bc6f2f43794d924def30

@plutovman

I have solved the issue, and of course it had to do with the way that I was feeding image data into

fillPixelBufferFromImage(image, pixelBuffer: managedPixelBuffer)

Providing a correct URL turned out to be trickier than I thought...

////
println ("...urlstring (urlstring)")
let test = NSURL(fileURLWithPath: urlstring)
println ("...url (test!)")
if let url = NSURL(fileURLWithPath: urlstring), let imageData = NSData(contentsOfURL: test!),let image = UIImage(data: imageData)

/////

At any rate, thank you again for posting this wonderful piece of code

@acj
Author

acj commented Aug 19, 2015

@plutovman I'm glad to hear that you solved the problem. I'm sorry (but not surprised) that it was silently failing when the image data was invalid. The error handling is nowhere near complete.

If you make any improvements to TimeLapseBuilder, please let me know. I'm planning to return to this code soon and will probably revise it a bit, do proper sanity/error checking, etc.

Good luck!

@justinlevi

justinlevi commented Sep 12, 2015

Hey guys, I'm working on trying to upgrade this example to Swift 2 and have a pretty simple project posted here

https://github.com/justinlevi/imagesToVideo/tree/master

I'm getting all black frames for some reason though. I'm pretty new to working with pixelBuffers so it's probably something silly I'm missing.

Here's my View Controller

import UIKit

class ViewController: UIViewController {

  override func viewDidLoad() {
    super.viewDidLoad()

    let path = NSBundle.mainBundle().pathForResource("hasselblad-01", ofType: "jpg")!

    var photosArray = [String]()
    for _ in 0...30 {
      photosArray.append(path)
    }

    let tlb = TimeLapseBuilder(photoURLs: photosArray)
    tlb.build({ (progress) -> Void in

      }, success: { (url) -> Void in
        print("SUCCESS: \(url)")
      }) { (error) -> Void in
        print(error)
    }

  }

}
//
//  TimeLapseBuilder.swift
//
//  Created by Adam Jensen on 5/10/15.
//  Copyright (c) 2015 Adam Jensen. All rights reserved.
//

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

public class TimeLapseBuilder: NSObject {
  let photoURLs: [String]
  var videoWriter: AVAssetWriter?

  public init(photoURLs: [String]) {
    self.photoURLs = photoURLs

    super.init()
  }

  public func build(progress: (NSProgress -> Void), success: (NSURL -> Void), failure: (NSError -> Void)) {
    let inputSize = CGSize(width: 600, height: 600)
    let outputSize = CGSize(width: 600, height: 600)
    var error: NSError?

    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory: NSURL = urls.first else {
      fatalError("documentDir Error")
    }

    let videoOutputURL = documentDirectory.URLByAppendingPathComponent("AssembledVideo.mov")

    if NSFileManager.defaultManager().fileExistsAtPath(videoOutputURL.path!) {
      do {
        try NSFileManager.defaultManager().removeItemAtPath(videoOutputURL.path!)
      }catch{
        fatalError("Unable to delete file: \(error) : \(__FUNCTION__).")
      }
    }

    guard let videoWriter = try? AVAssetWriter(URL: videoOutputURL, fileType: AVFileTypeQuickTimeMovie) else{
      fatalError("AVAssetWriter error")
    }

    let outputSettings = [
      AVVideoCodecKey  : AVVideoCodecH264,
      AVVideoWidthKey  : NSNumber(int: Int32(outputSize.width)),
      AVVideoHeightKey : NSNumber(int: Int32(outputSize.height)),
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)

    let sourcePixelBufferAttributesDictionary = [
      kCVPixelBufferPixelFormatTypeKey as String: NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
      kCVPixelBufferWidthKey as String: NSNumber(int: Int32(inputSize.width)),
      kCVPixelBufferHeightKey as String: NSNumber(int: Int32(inputSize.height))
    ]

    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
      assetWriterInput: videoWriterInput,
      sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary
    )

    assert(videoWriter.canAddInput(videoWriterInput))
    videoWriter.addInput(videoWriterInput)

    if videoWriter.startWriting() {
      videoWriter.startSessionAtSourceTime(kCMTimeZero)
      assert(pixelBufferAdaptor.pixelBufferPool != nil)

      let media_queue = dispatch_queue_create("mediaInputQueue", nil)

      videoWriterInput.requestMediaDataWhenReadyOnQueue(media_queue, usingBlock: { () -> Void in
        let fps: Int32 = 1
        let frameDuration = CMTimeMake(1, fps)
        let currentProgress = NSProgress(totalUnitCount: Int64(self.photoURLs.count))

        var frameCount: Int64 = 0
        var remainingPhotoURLs = [String](self.photoURLs)

        while (videoWriterInput.readyForMoreMediaData && !remainingPhotoURLs.isEmpty) {
          let nextPhotoURL = remainingPhotoURLs.removeAtIndex(0)
          let lastFrameTime = CMTimeMake(frameCount, fps)
          let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)


          if !self.appendPixelBufferForImageAtURL(nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
            error = NSError(
              domain: kErrorDomain,
              code: kFailedToAppendPixelBufferError,
              userInfo: [
                "description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer",
                "rawError": videoWriter.error ?? "(none)"
              ]
            )

            break
          }

          frameCount++

          currentProgress.completedUnitCount = frameCount
          progress(currentProgress)
        }

        videoWriterInput.markAsFinished()
        videoWriter.finishWritingWithCompletionHandler { () -> Void in
          if error == nil {
            success(videoOutputURL)
          }
        }
      })
    } else {
      error = NSError(
        domain: kErrorDomain,
        code: kFailedToStartAssetWriterError,
        userInfo: ["description": "AVAssetWriter failed to start writing"]
      )
    }

    if let error = error {
      failure(error)
    }
  }

  public func appendPixelBufferForImageAtURL(urlString: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
    var appendSucceeded = true

    autoreleasepool {
      if let image = UIImage(contentsOfFile: urlString) {

          var pixelBuffer: CVPixelBuffer? = nil
          let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
            kCFAllocatorDefault,
            pixelBufferAdaptor.pixelBufferPool!,
            &pixelBuffer
          )

          if let pixelBuffer = pixelBuffer where status == 0 {
            let managedPixelBuffer = pixelBuffer

            fillPixelBufferFromImage(image, pixelBuffer: managedPixelBuffer)
            appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(managedPixelBuffer, withPresentationTime: presentationTime)

          } else {
            NSLog("error: Failed to allocate pixel buffer from pool")
          }
      }
    }

    return appendSucceeded
  }

  public func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
    let imageData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))
    let lockStatus = CVPixelBufferLockBaseAddress(pixelBuffer, 0)

    let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)


    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

    let context = CGBitmapContextCreate(
      pixelData,
      Int(image.size.width),
      Int(image.size.height),
      8,
      Int(4 * image.size.width),
      rgbColorSpace,
      CGImageAlphaInfo.PremultipliedFirst.rawValue
    )

    CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
  }
}

@justinlevi

justinlevi commented Sep 15, 2015

Was able to get things figured out with the following update to fillPixelBufferFromImage

  func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBufferRef) {
    let imageData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage))
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)


    let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
    let bitmapInfo:CGBitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

    let context = CGBitmapContextCreate(
      pixelData,
      Int(image.size.width),
      Int(image.size.height),
      8,
      CVPixelBufferGetBytesPerRow(pixelBuffer),
      rgbColorSpace,
      bitmapInfo.rawValue
    )

    CGContextDrawImage(context, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
  }

@justinlevi

justinlevi commented Sep 15, 2015

I ended up rewriting the fillPixelBufferFromImage using an Obj-C example I found here:
http://stackoverflow.com/questions/25611086/cvpixelbufferpool-error-kcvreturninvalidargument-6661

func fillPixelBufferFromImage(image: CGImage, pixelBuffer: CVPixelBuffer){
    let frameSize = CGSizeMake(CGFloat(CGImageGetWidth(image)), CGFloat(CGImageGetHeight(image)))

    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let data = CVPixelBufferGetBaseAddress(pixelBuffer)
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let context = CGBitmapContextCreate(data, Int(frameSize.width), Int(frameSize.height), 8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace, CGImageAlphaInfo.PremultipliedFirst.rawValue)
    CGContextDrawImage(context, CGRectMake(0, 0, CGFloat(CGImageGetWidth(image)), CGFloat(CGImageGetHeight(image))), image)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
  }

Seems to actually be a touch faster for me as well.

@acj
Author

acj commented Oct 17, 2015

@justinlevi Thanks for sharing your progress! I've posted an updated implementation for Swift 2.0 that takes a similar approach to yours.

I'm seeing a slight performance increase as well. I think that's due to removing the (completely unnecessary) call to CGDataProviderCopyData, whose result never got used anyway. My mistake.

@KyleDEV

KyleDEV commented Oct 28, 2015

Hi.

I'm building an app using this code.
I need to make a video from over 1000 images, but it crashes because of growing memory usage around the 150th image.
Xcode says 'Message from debugger: Terminated due to memory issue'.
Do you know where I should start looking?

@acj
Author

acj commented Nov 1, 2015

@KyleDEV Are you using the original (Swift 1.2) implementation or the one for Swift 2.0? Which platform are you targeting?

EDIT: I've updated the Swift 2 implementation to clean up CVPixelBuffer instances after it's done using them. That change seems to fix the memory leak for me.

@acj
Author

acj commented Mar 29, 2016

@panabee For the photo sets I used in my testing, readyForMoreMediaData was always true and didn't trigger the scenario that you're describing. You're right that it could present a problem, though, and for the sake of robustness you probably should maintain that state (frameCount, etc) outside of the block.
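
For anyone who does run into it, here is a rough sketch of that idea (Swift 3 syntax; the type, property, and parameter names here are mine, not from the gist): hoist the per-frame state onto the object so the requestMediaDataWhenReady block can resume where it left off the next time the input becomes ready.

import AVFoundation

class StatefulTimeLapseWriter {
    private let fps: Int32 = 30
    private var frameCount: Int64 = 0
    private var remainingPhotoURLs: [String]

    init(photoURLs: [String]) {
        self.remainingPhotoURLs = photoURLs
    }

    // appendFrame should do what appendPixelBufferForImageAtURL does in the gist;
    // error handling and progress reporting are omitted from this sketch.
    func startAppending(to writerInput: AVAssetWriterInput,
                        on mediaQueue: DispatchQueue,
                        appendFrame: @escaping (String, CMTime) -> Bool,
                        whenDone finish: @escaping () -> Void) {
        writerInput.requestMediaDataWhenReady(on: mediaQueue) {
            // Drain as many frames as the input will accept right now. Because the
            // remaining URLs and frame count live on self, this block can safely be
            // invoked again later if isReadyForMoreMediaData turns false mid-run.
            while writerInput.isReadyForMoreMediaData && !self.remainingPhotoURLs.isEmpty {
                let nextPhotoURL = self.remainingPhotoURLs.remove(at: 0)
                let presentationTime = CMTimeMake(self.frameCount, self.fps)
                if !appendFrame(nextPhotoURL, presentationTime) {
                    break
                }
                self.frameCount += 1
            }
            if self.remainingPhotoURLs.isEmpty {
                writerInput.markAsFinished()
                finish()
            }
        }
    }
}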

@Baldman68

Hi, first, thank you for writing this!! I am trying to adapt your code into my app, but I am running into an issue in appendPixelBufferForImageAtURL. Forgive my noobness with NSURLs, but I fail on either "let url = NSURL(string: url)" or "image = UIImage(data: imageData)". I am passing an array of strings with paths to my images, which live in the documents directory. In debug, the url in appendPixelBufferForImageAtURL looks like: "/var/mobile/Containers/Data/Application/F845B860-A691-4CE0-ADB1-B7F8E883B9AE/Documents/07-02-2016%2003:22:33%20PM.png". Is this a correct format? It seems as though this is where my problem lies. You'll note I'm encoding the file name to remove the spaces.

Any help would be greatly appreciated!

@Baldman68

Baldman68 commented Jul 5, 2016

OK, I was able to solve the issue. In case anyone else runs into it: I had to remove the encoding of the filename and then change appendPixelBufferForImageAtURL like so:

func appendPixelBufferForImageAtURL(url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
    var appendSucceeded = false

    let fm = NSFileManager.defaultManager()
    let data:NSData? = NSData(data: fm.contentsAtPath(url)!)
    autoreleasepool {
      if let imageData = data,
        let image = UIImage(data: imageData),
        let pixelBufferPool = pixelBufferAdaptor.pixelBufferPool {
          let pixelBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.alloc(1)
          let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
            kCFAllocatorDefault,
            pixelBufferPool,
            pixelBufferPointer
          )

          if let pixelBuffer = pixelBufferPointer.memory where status == 0 {
            fillPixelBufferFromImage(image, pixelBuffer: pixelBuffer)

            appendSucceeded = pixelBufferAdaptor.appendPixelBuffer(
              pixelBuffer,
              withPresentationTime: presentationTime
            )

            pixelBufferPointer.destroy()
          } else {
            NSLog("error: Failed to allocate pixel buffer from pool")
          }

          pixelBufferPointer.dealloc(1)
      }
    }

    return appendSucceeded
  }
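
The underlying gotcha is that NSURL(string:) expects a complete URL (scheme plus percent-encoding), whereas a bare filesystem path wants NSURL(fileURLWithPath:). A sketch of the same fix via a file URL (Swift 2 era; the path is illustrative):

let path = "/path/to/Documents/07-02-2016 03:22:33 PM.png" // plain path, no percent-encoding needed
let fileURL = NSURL(fileURLWithPath: path)
if let imageData = NSData(contentsOfURL: fileURL),
    let image = UIImage(data: imageData) {
    // hand image to fillPixelBufferFromImage(_:pixelBuffer:) as in the gist
}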

@stanchiang

Hey guys, thanks for putting this together in Swift; it's definitely saving me lots of trial and error. Here's a link to a similar Swift implementation from Stack Overflow, just for comparison; I can confirm it worked for me in Swift 2 / Xcode 7.3.
http://stackoverflow.com/a/36290742/1079379

@ggua5470

ggua5470 commented Oct 7, 2016

I just made a working Swift 3 version (i.e. it can generate a playable .mov video with no stretching). The logic is slightly changed, such as passing in the videoOutputURL and resizing images to 640x640 (the dimensions need to be a multiple of 16 to avoid distortion, right? see the small rounding helper sketched after the code below), but the main video processing logic is unchanged (I had never used AVAssetWriter before).
Someone should also check whether this Swift 3 version is doing everything right, since it converts a lot of Swift 2 syntax to Swift 3 syntax, especially everything related to CVPixelBuffer.

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

class TimeLapseBuilder: NSObject {
    let photoURLs: [String]
    let videoOutputURL:URL
    var videoWriter: AVAssetWriter?

    init(photoURLs: [String], videoOutputURL:URL) {
        self.photoURLs = photoURLs
        self.videoOutputURL = videoOutputURL
    }

    func build(progress: @escaping ((Progress) -> Void), success: @escaping ((URL) -> Void), failure: ((Error) -> Void)) {
        let inputSize = CGSize(width: 640, height: 640)
        let outputSize = CGSize(width: 640, height: 640)
        var error: NSError?

        if(FileManager.default.fileExists(atPath: videoOutputURL.path)){
            do{
                try FileManager.default.removeItem(at: videoOutputURL)

                print("+++ OK deleting video file at \(videoOutputURL.path)")
            }catch let error as NSError {
                print("--- Ooops! Error deleting video file at \(videoOutputURL.path): \(error)")
            }
        }



        var videoWriter:AVAssetWriter?
        do {
            try
                videoWriter = AVAssetWriter(outputURL: videoOutputURL, fileType: AVFileTypeQuickTimeMovie)
        }
        catch let error as NSError {
            print("--- Ooops! Error creating AVAssetWriter: \(error)")
        }


        if let videoWriter = videoWriter {
            let videoSettings: [String : Any] = [
                AVVideoCodecKey  : AVVideoCodecH264,
                AVVideoWidthKey  : outputSize.width,
                AVVideoHeightKey : outputSize.height,
            ]


            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)


            let sourcePixelBufferAttributes: [String : Any] = [
                kCVPixelBufferPixelFormatTypeKey as String : Int(kCVPixelFormatType_32ARGB),
                kCVPixelBufferWidthKey as String : Float(inputSize.width),
                kCVPixelBufferHeightKey as String : Float(inputSize.height),
                ]

            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: videoWriterInput,
                sourcePixelBufferAttributes: sourcePixelBufferAttributes
            )


            assert(videoWriter.canAdd(videoWriterInput))
            videoWriter.add(videoWriterInput)


            if videoWriter.startWriting() {

                videoWriter.startSession(atSourceTime: kCMTimeZero)
                assert(pixelBufferAdaptor.pixelBufferPool != nil)

                let media_queue = DispatchQueue(label: "mediaInputQueue") //Create a serial queue

                videoWriterInput.requestMediaDataWhenReady(on: media_queue, using: {
                    () -> Void in
                    let fps: Int32 = 1  //25

                    let currentProgress = Progress(totalUnitCount: Int64(self.photoURLs.count))

                    var frameCount: Int64 = 0
                    var remainingPhotoURLs = [String](self.photoURLs)

                    while (videoWriterInput.isReadyForMoreMediaData && !remainingPhotoURLs.isEmpty) {
                        let nextPhotoURL = remainingPhotoURLs.remove(at: 0)
                        let thisFrameTime = CMTimeMake(frameCount, fps)
                        let presentationTime = thisFrameTime

                        if !self.appendPixelBufferForImageAtURL(url: nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime) {
                            error = NSError(
                                domain: kErrorDomain,
                                code: kFailedToAppendPixelBufferError,
                                userInfo: [
                                    "description": "AVAssetWriterInputPixelBufferAdapter failed to append pixel buffer",
                                    "rawError": videoWriter.error != nil ? "\(videoWriter.error)" : "(none)"
                                ]
                            )

                            break
                        }

                        frameCount += 1


                        currentProgress.completedUnitCount = frameCount
                        progress(currentProgress)
                    }


                    videoWriterInput.markAsFinished()
                    videoWriter.finishWriting { () -> Void in
                        if error == nil {
                            success(self.videoOutputURL)
                        }
                    }
                })
            } else {
                error = NSError(
                    domain: kErrorDomain,
                    code: kFailedToStartAssetWriterError,
                    userInfo: ["description": "AVAssetWriter failed to start writing: \(videoWriter.error)"]
                )
                print("AVAssetWriter failed to start writing: \(videoWriter.error)")
            }
        }

        if let error = error {
            failure(error)
        }
    }

    func appendPixelBufferForImageAtURL(url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
        var appendSucceeded = true

        autoreleasepool {
            if let url = URL(string: url),
                let imageData = NSData(contentsOf: url),
                let image = UIImage(data: imageData as Data) {


                if let image = resizeImage(image: image, newWidth: 640){
                    let pixelBuffer: UnsafeMutablePointer<CVPixelBuffer?> = UnsafeMutablePointer<CVPixelBuffer?>.allocate(capacity: 1)
                    let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                        kCFAllocatorDefault,
                        pixelBufferAdaptor.pixelBufferPool!,
                        pixelBuffer
                    )

                    if let pixelBuffer = pixelBuffer.pointee, status == 0 {

                        fillPixelBufferFromImage(image: image, pixelBuffer: pixelBuffer)

                        appendSucceeded = pixelBufferAdaptor.append(
                            pixelBuffer,
                            withPresentationTime: presentationTime
                        )
                    } else {
                        NSLog("error: Failed to allocate pixel buffer from pool")
                    }
                }
            }
        }

        return appendSucceeded
    }

    func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBuffer) {
        _ = CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()

        let context = CGContext(
            data: pixelData,
            width: Int(image.size.width),
            height: Int(image.size.height),
            bitsPerComponent: 8,
            bytesPerRow: Int(4 * image.size.width),
            space: rgbColorSpace,
            bitmapInfo: bitmapInfo.rawValue
        )

        context?.draw(image.cgImage!, in:CGRect(x:0, y:0, width:image.size.width, height:image.size.height))

        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
    }

    func resizeImage(image: UIImage, newWidth: CGFloat) -> UIImage? {
        let scale = newWidth / image.size.width
        let newHeight = image.size.height * scale
        UIGraphicsBeginImageContext(CGSize(width: newWidth, height: newHeight))
        image.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return newImage
    }
}

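On the multiple-of-16 question: H.264 encodes in 16x16 macroblocks, so output dimensions that are not multiples of 16 can pick up padding or scaling artifacts in some pipelines, which may be the distortion described above. A small illustrative helper (not from the code above) for rounding a size:

import CoreGraphics

// Round each dimension to the nearest multiple of 16 (the H.264 macroblock size).
func sizeAlignedToMacroblock(_ size: CGSize) -> CGSize {
    let width = 16 * Int((size.width / 16).rounded())
    let height = 16 * Int((size.height / 16).rounded())
    return CGSize(width: width, height: height)
}
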
@brunaaleixo

brunaaleixo commented Nov 17, 2016

Hi, I have translated your code to Swift 3. Thanks for sharing the original code. Here it is:

import AVFoundation
import UIKit

let kErrorDomain = "TimeLapseBuilder"
let kFailedToStartAssetWriterError = 0
let kFailedToAppendPixelBufferError = 1

class TimeLapseBuilder: NSObject {
    let photoURLs: [String]
    var videoWriter: AVAssetWriter?
    
    init(photoURLs: [String]) {
        self.photoURLs = photoURLs
    }
    
    func build(progress: @escaping ((Progress) -> Void), success: @escaping ((NSURL) -> Void), failure: ((NSError) -> Void)) {
        let inputSize = CGSize(width: 4000, height: 3000)
        let outputSize = CGSize(width: 1280, height: 720)
        var error: NSError?
        
        let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
        let videoOutputURL = NSURL(fileURLWithPath: documentsPath.appendingPathComponent("AssembledVideo.mov"))
        
        do {
            try FileManager.default.removeItem(at: videoOutputURL as URL)
        } catch {}
        
        do {
            try videoWriter = AVAssetWriter(outputURL: videoOutputURL as URL, fileType: AVFileTypeQuickTimeMovie)
        } catch let writerError as NSError {
            error = writerError
            videoWriter = nil
        }
        
        if let videoWriter = videoWriter {
            let videoSettings: [String : AnyObject] = [
                AVVideoCodecKey  : AVVideoCodecH264 as AnyObject,
                AVVideoWidthKey  : outputSize.width as AnyObject,
                AVVideoHeightKey : outputSize.height as AnyObject,
                //        AVVideoCompressionPropertiesKey : [
                //          AVVideoAverageBitRateKey : NSInteger(1000000),
                //          AVVideoMaxKeyFrameIntervalKey : NSInteger(16),
                //          AVVideoProfileLevelKey : AVVideoProfileLevelH264BaselineAutoLevel
                //        ]
            ]
            
            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
            
        
            let sourceBufferAttributes = [
                (kCVPixelBufferPixelFormatTypeKey as String): Int(kCVPixelFormatType_32ARGB),
                (kCVPixelBufferWidthKey as String): Float(inputSize.width),
                (kCVPixelBufferHeightKey as String): Float(inputSize.height)] as [String : Any]
            
            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
                assetWriterInput: videoWriterInput,
                sourcePixelBufferAttributes: sourceBufferAttributes
            )
            
            assert(videoWriter.canAdd(videoWriterInput))
            videoWriter.add(videoWriterInput)
            
            if videoWriter.startWriting() {
                videoWriter.startSession(atSourceTime: kCMTimeZero)
                assert(pixelBufferAdaptor.pixelBufferPool != nil)
                
                let media_queue = DispatchQueue(label: "mediaInputQueue")
                
                videoWriterInput.requestMediaDataWhenReady(on: media_queue, using: { () -> Void in
                    let fps: Int32 = 30
                    let frameDuration = CMTimeMake(1, fps)
                    let currentProgress = Progress(totalUnitCount: Int64(self.photoURLs.count))
                    
                    var frameCount: Int64 = 0
                    var remainingPhotoURLs = [String](self.photoURLs)
                    
                    while (videoWriterInput.isReadyForMoreMediaData && !remainingPhotoURLs.isEmpty) {
                        let nextPhotoURL = remainingPhotoURLs.remove(at: 0)
                        let lastFrameTime = CMTimeMake(frameCount, fps)
                        let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)
                        
                        
                        self.appendPixelBufferForImageAtURL(url: nextPhotoURL, pixelBufferAdaptor: pixelBufferAdaptor, presentationTime: presentationTime)
                   
            
                        frameCount += 1
                        
                        currentProgress.completedUnitCount = frameCount
                        progress(currentProgress)
                    }
                    
                    videoWriterInput.markAsFinished()
                    videoWriter.finishWriting { () -> Void in
                        if error == nil {
                            success(videoOutputURL)
                        }
                        
                        self.videoWriter = nil
                    }
                })
            } else {
                error = NSError(
                    domain: kErrorDomain,
                    code: kFailedToStartAssetWriterError,
                    userInfo: ["description": "AVAssetWriter failed to start writing"]
                )
            }
        }
        
        if let error = error {
            failure(error)
        }
    }
    
    func appendPixelBufferForImageAtURL(url: String, pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor, presentationTime: CMTime) -> Bool {
        var appendSucceeded = false
        
        autoreleasepool {
            if let url = NSURL(string: url),
                let imageData = NSData(contentsOf: url as URL),
                let image = UIImage(data: imageData as Data),
                let pixelBufferPool = pixelBufferAdaptor.pixelBufferPool {
                let pixelBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.allocate(capacity: 1)
                let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                    kCFAllocatorDefault,
                    pixelBufferPool,
                    pixelBufferPointer
                )
                
                let pixelBuffer = pixelBufferPointer.pointee
                
                if pixelBuffer != nil && status == 0 {
                    fillPixelBufferFromImage(image: image, pixelBuffer: pixelBuffer!)
                    
                    appendSucceeded = pixelBufferAdaptor.append(
                        pixelBuffer!,
                        withPresentationTime: presentationTime
                    )
                    
                    pixelBufferPointer.deinitialize()
                } else {
                    NSLog("error: Failed to allocate pixel buffer from pool")
                }
                
                pixelBufferPointer.deallocate(capacity: 1)
            }
        }
        
        return appendSucceeded
    }
    
    func fillPixelBufferFromImage(image: UIImage, pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
        
        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        
        let context = CGContext(
            data: pixelData,
            width: Int(image.size.width),
            height: Int(image.size.height),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: rgbColorSpace,
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
        )
        
        let rect = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
        context?.draw(image.cgImage!, in: rect)
        
        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
    }
}

@acj
Author

acj commented Nov 21, 2016

I've posted an updated version of TimeLapseBuilder.swift that uses the Swift 3 syntax. Thanks to @ggua5470, @brunaaleixo, and others for stepping up.

@jacobokoenig

Lifesaver!!!!!

@seyhagithub

Hello everyone. My project streams JPEG frames from an API (the API responds with an image), and I want to record them as a video. How can I do that with this source code, since I see it accepts strings as paths? Please help me, and thank you in advance.

@acj
Author

acj commented Feb 23, 2017

@seyhagithub Hi there. If you're receiving raw JPEG frames, you could adapt the appendPixelBufferForImageAtURL(url:pixelBufferAdaptor:presentationTime:) method to take an NSData or UIImage instead. Most of the work would be removing the code that makes an NSData object for each URL.
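
For example, a sketch of that adaptation against the Swift 3.0 version above (the method name and signature are illustrative, not part of the gist; it reuses the gist's fillPixelBufferFromImage(_:pixelBuffer:)):

func appendPixelBuffer(for image: UIImage,
                       pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor,
                       presentationTime: CMTime) -> Bool {
    var appendSucceeded = false

    autoreleasepool {
        if let pixelBufferPool = pixelBufferAdaptor.pixelBufferPool {
            let pixelBufferPointer = UnsafeMutablePointer<CVPixelBuffer?>.allocate(capacity: 1)
            let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(
                kCFAllocatorDefault,
                pixelBufferPool,
                pixelBufferPointer
            )

            if let pixelBuffer = pixelBufferPointer.pointee, status == 0 {
                // The image arrives already decoded, so there is no URL or NSData step.
                fillPixelBufferFromImage(image, pixelBuffer: pixelBuffer)

                appendSucceeded = pixelBufferAdaptor.append(
                    pixelBuffer,
                    withPresentationTime: presentationTime
                )

                pixelBufferPointer.deinitialize()
            } else {
                NSLog("error: Failed to allocate pixel buffer from pool")
            }

            pixelBufferPointer.deallocate(capacity: 1)
        }
    }

    return appendSucceeded
}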

@norsez

norsez commented Mar 5, 2017

Thanks for this!

@seanmcneil

seanmcneil commented Mar 9, 2017

If anyone is interested, I used acj's Swift 3.0 work as a basis to create a Cocoapod for writing images out to videos that you can install via pod Spitfire. Thanks to acj & others who contributed for helping me out. Figured this could be a good way to pay back :)

@Salman-Majid

Guys, I used this project to build my application that takes some images and a music file and converts them into a video. But the images don't have any animations; they just appear and disappear. I want to apply animations to them. I've done a little research on the CALayer class, but I found nothing about adding animations to video with it. Can anybody help me add the desired animations to individual photos? That would be great. Thanks

@acj
Author

acj commented Mar 12, 2017

@seanmcneil nice! Thanks for sharing.

@Salman-Majid I recommend looking at AVMutableComposition. You can use it to mix multiple tracks, including Core Animation layers. There are a few tutorials and WWDC videos that cover the details. Please refer any followup questions to Stack Overflow. Good luck!
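
To make that pointer a little more concrete, here is a rough sketch (Swift 3, illustrative only and not from this gist) of the AVVideoCompositionCoreAnimationTool piece, which is how a composition-based export carries Core Animation layers:

import AVFoundation
import UIKit

// asset is assumed to be the movie produced by TimeLapseBuilder.
func makeVideoComposition(for asset: AVAsset, renderSize: CGSize) -> AVMutableVideoComposition {
    // The video layer is where the movie's frames are rendered; the parent layer can
    // carry any additional CALayers and CAAnimations (fades, transforms, titles, ...).
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)

    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)

    let videoComposition = AVMutableVideoComposition(propertiesOf: asset)
    videoComposition.renderSize = renderSize
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer,
        in: parentLayer
    )
    return videoComposition
}

You would then assign the result to an AVAssetExportSession's videoComposition property; the WWDC material mentioned above covers the remaining instruction setup.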

@acj
Author

acj commented Mar 12, 2017

This code has moved

Please use TimeLapseBuilder-Swift on GitHub instead.

Thank you, everyone, for your kind feedback, comments, and contributions. TimeLapseBuilder has outgrown this gist, and it's gotten difficult to manage the code and conversation. I've created a repository that should be used instead. Cheers!
