How to implement video capture on the Mac

If you want to capture the data from an iSight (or any other video camera) on OS X, figuring out where to start is tough. Video input and output are both controlled by QuickTime, an amazingly successful framework, but as a long-lived interface to rapidly changing hardware it has accumulated an impenetrable thicket of APIs. That means there's no obvious StartVideoCapture() function; instead you have to use some odd legacy calls.
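As a rough sketch of those incantations: the legacy path goes through the Sequence Grabber component. The shape below follows the setup in SGDataProcSample; the function name StartCapture is mine, error checking is omitted, and every call actually returns a result code you should test.

```c
#include <QuickTime/QuickTime.h>

/* Sketch of Sequence Grabber setup (classic QuickTime C API).
   Opens the default grabber, creates a video channel, and starts
   recording with a data callback instead of writing a movie file. */
static SeqGrabComponent StartCapture(SGDataUPP dataProc)
{
    SeqGrabComponent seqGrab = OpenDefaultComponent(SeqGrabComponentType, 0);
    SGInitialize(seqGrab);

    /* We want the frames ourselves, not a movie file on disk */
    SGSetDataRef(seqGrab, 0, 0, seqGrabDontMakeMovie);

    SGChannel videoChannel;
    SGNewChannel(seqGrab, VideoMediaType, &videoChannel);
    SGSetChannelUsage(videoChannel, seqGrabRecord);

    /* dataProc is called with each captured frame's bytes */
    SGSetDataProc(seqGrab, dataProc, 0);
    SGStartRecord(seqGrab);

    /* Caller must then invoke SGIdle(seqGrab) frequently, e.g. from
       a timer or a dedicated thread, to actually pump frames */
    return seqGrab;
}
```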

Here's the official SGDataProcSample code demonstrating video capture to a data buffer. The actual source you want is in MiniMung.c, appropriately named after the joke acronym Mung Until No Good. Don't try to make too much sense of the actual functions; just accept that these are the magical incantations you need to mutter to get it working.
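The heart of that sample is the data proc the Sequence Grabber calls back with each captured frame. Its signature is fixed by the API; the body here is just a hedged placeholder for whatever you do with the bytes.

```c
/* Called by the Sequence Grabber for each chunk of captured video.
   p points at len bytes of frame data (usually compressed), which you
   must copy or decompress before returning. */
pascal OSErr mySGDataProc(SGChannel c, Ptr p, long len, long *offset,
                          long chRefCon, TimeValue time, short writeType,
                          long refCon)
{
    /* In MiniMung this is where the frame gets decompressed, e.g. via
       DecompressSequenceFrameS() into a GWorld; a simpler option is to
       memcpy the bytes into your own buffer for another thread to use */
    return noErr;
}
```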

If you want an example of uploading the captured data to an OpenGL texture, you can download the source code to my Live Feed video plugin for Motion and look through it for the capture code. I have a separate thread running that captures the video and copies it to a buffer, while the rendering thread constantly uploads a texture from it. There's a risk of tearing, but it keeps the logic simple and doesn't require any blocking.
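The two-thread pattern itself is portable, so here's a minimal sketch of it in plain C with pthreads. Everything here is illustrative (the names, the frame size, the synthesized data); the point is that the capture thread overwrites one shared buffer while the render thread snapshots it without any locks, so a snapshot can mix two frames (tearing) but neither thread ever waits.

```c
#include <pthread.h>
#include <stdint.h>
#include <string.h>

#define FRAME_BYTES (640 * 480 * 4)  /* e.g. one 640x480 BGRA frame */

/* Shared buffer: written by the capture thread, read by the renderer */
static uint8_t frame_buffer[FRAME_BYTES];
static volatile int frames_captured = 0;

/* Capture thread: in the real plugin the bytes would come from the
   Sequence Grabber data proc; here we just synthesize 100 frames. */
static void *capture_thread(void *arg)
{
    (void)arg;
    for (int i = 1; i <= 100; i++) {
        memset(frame_buffer, i, FRAME_BYTES);  /* "frame" i */
        frames_captured = i;
    }
    return NULL;
}

/* Render-thread side: snapshot whatever is in the shared buffer.
   In the real code this copy would be handed to glTexSubImage2D().
   No lock is taken, so the copy may straddle two frames. */
static void snapshot_frame(uint8_t *dst)
{
    memcpy(dst, frame_buffer, FRAME_BYTES);
}
```

If tearing ever became objectionable, the usual fix is double-buffering with an atomic pointer swap, but for a live video preview the simple version above is often good enough.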
