TECH | Hardware-accelerated H.264 decoding for iPhone (finally)

10 Oct
In spite of around-the-clock violence of the demonic kind today, I somehow managed to take what is probably the only surviving copy of Apple’s non-functioning sample project, VideoTimeline, and make it work, which surely makes mine the only copy that does. So, nerd that, demon trash!

A quick rewrite of the VideoTimeline sample app from Apple Developer Connection, to make it actually work: it uses AVFoundation for reading video data, VideoToolbox for decompressing sample buffers, and OpenGL ES for post-processing.

Although referenced throughout the web, the original can no longer be downloaded anywhere; and, even where a copy can be found, the project files compile, but don’t run on an iPhone or in the iPad simulator. Because there is virtually no documentation or working sample code for the VideoToolbox API, the highly touted VideoTimeline project became much sought after by developers looking to add the low-level video encoding/decoding functionality the API provides.

Unfortunately, none of the links on the sites that purport to host the sample project work; they are all broken. Some developers claim to have found a copy using a very old link to the project on Apple’s site; but, none seem to actually have it. I somehow managed to find such a link; but, in order to actually download it, I had to perform an elaborate download dance with Safari, authenticating to the Apple Developer Connection as many as three times in a row and restarting my browser somewhere in between. I’m not joking.

That’s how I obtained my first copy of the project, and when I began to suspect that a working version did not exist anywhere—even at Apple. I confirmed that suspicion later, after asking Apple Developer Support for the latest version of the sample project. They, too, sent me the same, non-working project files, claiming to have no other.

Here, then, to the best of my knowledge, is the only working version:

DOWNLOAD | VideoTimeline (OpenGL 3.0 version)

The scope of the sample project is de minimis, merely demonstrating how to route the sample buffer data output by AVAssetReader through the hardware decoder on an iPhone. The advantages of that are obvious when it comes to real-time video playback, but the demonstration might be pointless when used in conjunction with AVAssetReader, which may or may not already use that same hardware, and which I know for a fact already decodes sample buffers (and, if it doesn’t, the simple addition of AVSampleBufferDisplayLayer does).
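For anyone staring down that routing problem, the core of it is a VTDecompressionSession created from the track’s format description; decoded frames then arrive in a C callback as CVPixelBuffers. Here is a minimal sketch of my understanding of that flow—the function name, the choice of BGRA output, and the use of the modern Swift overlay are my own assumptions, not taken from the sample project:

```swift
import AVFoundation
import VideoToolbox

// Sketch: create a decompression session for an H.264 track and decode one
// compressed sample buffer obtained from AVAssetReader. Names illustrative.
func decode(_ sample: CMSampleBuffer, format: CMVideoFormatDescription) {
    var session: VTDecompressionSession?

    // Request BGRA output so decoded frames can be uploaded as GL textures.
    let attrs = [kCVPixelBufferPixelFormatTypeKey as String:
                 kCVPixelFormatType_32BGRA] as CFDictionary

    // Decoded frames arrive here, on VideoToolbox's own queue.
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: { _, _, status, _, imageBuffer, pts, _ in
            guard status == noErr, let pixelBuffer = imageBuffer else { return }
            // Hand pixelBuffer (and pts, for ordering) to the render layer.
            _ = (pixelBuffer, pts)
        },
        decompressionOutputRefCon: nil)

    guard VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: format,
            decoderSpecification: nil,  // nil = let VideoToolbox pick the decoder
            imageBufferAttributes: attrs,
            outputCallback: &callback,
            decompressionSessionOut: &session) == noErr,
          let session = session else { return }

    VTDecompressionSessionDecodeFrame(
        session, sampleBuffer: sample,
        flags: [._EnableAsynchronousDecompression],
        frameRefcon: nil, infoFlagsOut: nil)
    VTDecompressionSessionWaitForAsynchronousFrames(session)
}
```

In practice you would create the session once per format description and reuse it across frames, tearing it down only when the format changes.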

Regardless, this should give developers an entry point into VideoToolbox, as well as a good idea of its complexity and performance, and of just how much they don’t know, but need to know.

NOTE | Good luck climbing that mountain; watch the temperature on your iPhone at about the 15-second mark of any playing video. You’ve got the GPU and the other piece of hardware (?) cooking at the same time.

This project was originally intended to accompany Session #514 of the 2014 WWDC. My version is nearly identical to the two-year-old original, but omits UIImagePickerController, playing instead the first video in the Videos folder of the Photos app on iPhone; it also renders video using OpenGL ES 3.0—not 2.0. Also removed is the method that displays each frame in a separate layer, an unnecessary, although pretty cool, effect. Those things can easily be added back; but, without them, the project is closer to what someone would actually use in a production implementation. What I didn’t do is remove the method causing a memory leak, namely, the one that displays the timecode overlay on the video window. Simply remove it and its call from the CAEAGLLayer subclass to hold memory consumption at a stable 11 MB.
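The reading side is the part AVAssetReader handles, and the one detail that matters for VideoToolbox is passing nil outputSettings, so the track output hands back compressed H.264 sample buffers instead of decoding them itself. A sketch of that setup, assuming you already have a file URL for the video (the helper name is mine, not the sample project’s):

```swift
import AVFoundation

// Sketch: open a video track for reading, leaving the samples compressed
// so they can be fed to a VTDecompressionSession. Assumes a local file URL.
func makeTrackOutput(for url: URL) throws
        -> (AVAssetReader, AVAssetReaderTrackOutput)? {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else {
        return nil
    }

    let reader = try AVAssetReader(asset: asset)
    // nil outputSettings = pass-through: compressed H.264 sample buffers,
    // which is exactly what VTDecompressionSessionDecodeFrame expects.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()
    return (reader, output)
}
```

From there, calling copyNextSampleBuffer() on the output yields each compressed frame until the reader’s status leaves .reading.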

The relevance to a demoniac (and otherwise)
This effort was made in order to (eventually) add hardware-accelerated H.264 decoding to Chroma, the demon-finding app now in progress; having undertaken this exercise, I can now do just that without the overhead of the excess and unnecessary resources consumed by a higher-level API such as AVFoundation. That is essential, since Chroma processes video in real time while users make edits, whether shooting a new video or playing a stored one. Chroma already accomplishes this without VideoToolbox, but doing so makes it an overly aggressive consumer of resources, to the detriment of other apps on iPhone. Unneighborly is not good on an already resource-strapped device.


Posted by on October 10, 2016 in Uncategorized

